Initial packaging import (from lucab/wip-bootstrap)
author    Luca Bruno <lucab@debian.org>
          Wed, 19 Aug 2015 09:54:41 +0000 (11:54 +0200)
committer Luca Bruno <lucab@debian.org>
          Wed, 19 Aug 2015 09:54:41 +0000 (11:54 +0200)
26 files changed:
debian/README.source [new file with mode: 0644]
debian/bootstrap.py [new file with mode: 0755]
debian/cargo.bash-completion [new file with mode: 0644]
debian/cargo.dirs [new file with mode: 0644]
debian/cargo.doc-base [new file with mode: 0644]
debian/cargo.docs [new file with mode: 0644]
debian/cargo.lintian-overrides [new file with mode: 0644]
debian/cargo.manpages [new file with mode: 0644]
debian/changelog [new file with mode: 0644]
debian/compat [new file with mode: 0644]
debian/control [new file with mode: 0644]
debian/copyright [new file with mode: 0644]
debian/crates.io-index [new file with mode: 0644]
debian/deps-tarball-filter.txt [new file with mode: 0644]
debian/docs [new file with mode: 0644]
debian/gbp.conf [new file with mode: 0644]
debian/install [new file with mode: 0644]
debian/make_orig_multi.sh [new file with mode: 0755]
debian/missing-sources/prism.js [new file with mode: 0644]
debian/patches/add-paths-override.patch [new file with mode: 0644]
debian/patches/remove-cargo-devdeps.patch [new file with mode: 0644]
debian/patches/remove-deps-path.patch [new file with mode: 0644]
debian/patches/series [new file with mode: 0644]
debian/rules [new file with mode: 0755]
debian/source/format [new file with mode: 0644]
debian/watch [new file with mode: 0644]

diff --git a/debian/README.source b/debian/README.source
new file mode 100644 (file)
index 0000000..45764fd
--- /dev/null
@@ -0,0 +1,34 @@
+Current packaging of cargo is highly non-standard because both the
+language (Rust) and its package manager (Cargo) are young projects
+with a high rate of change.
+
+Debian does not yet have a packaging policy nor a clear vision on how
+to package dependency modules ("crates") in a working and reusable way.
+Moreover, cargo's current approach to modules and its registry is
+biased towards "always-online" use.
+
+For these reasons, we currently resort to several workarounds to
+build cargo:
+ 1. we use a custom script (debian/bootstrap.py) to build a local
+    stage0, instead of downloading/embedding a snapshotted binary.
+ 2. we embed a copy of crates.io-index, to avoid downloading the
+    registry from github.
+ 3. we embed all dependency crates, because cargo needs external
+    modules but we are still not sure how they will be packaged
+    in Debian.
+ 4. we generate a .cargo/config at build-time, to override paths and
+    registry.
+ 5. we create a temporary git repository at build-time for the
+    registry, as this is needed by cargo.
+
+As such, the original source is composed of three tarballs:
+ * cargo source
+ * crates.io-index registry (under index/)
+ * dependency crates (under deps/), stripped of unused embedded
+   C libraries
+
+In the near future, we will try to get rid of this complex setup
+as much as possible. For the moment, we want to ship rustc+cargo
+first and postpone handling of third-party crates and apps.
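+
+As a rough illustration of points 2 and 4 above, the generated
+.cargo/config might look like this (the paths are illustrative only,
+not the ones actually produced at build-time):
+
+    # hypothetical build-time override generated by the packaging
+    paths = ["/build/cargo/deps/libc", "/build/cargo/deps/toml"]
+
+    [registry]
+    index = "file:///build/cargo/index"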
+
+ -- Luca Bruno <lucab@debian.org>  Tue, 11 Aug 2015 22:57:36 +0200
diff --git a/debian/bootstrap.py b/debian/bootstrap.py
new file mode 100755 (executable)
index 0000000..25f315a
--- /dev/null
@@ -0,0 +1,1267 @@
+#!/usr/bin/env python
+"""
+About
+=====
+
+This Python script is designed to do the bare minimum to compile and link the
+Cargo binary for the purposes of bootstrapping itself on a new platform for
+which cross-compiling isn't possible.  I wrote this specifically to bootstrap
+Cargo on [Bitrig](https://bitrig.org).  Bitrig is a fork of OpenBSD that uses
+clang/clang++ and other BSD licensed tools instead of GNU licensed software.
+Cross compiling from another platform is extremely difficult because of the
+alternative toolchain Bitrig uses.
+
+All that should be necessary to run this script is a working Rust
+toolchain, Python, and Git.
+
+This script will not set up a full cargo cache or anything.  It works by
+cloning the cargo index and then, starting with the cargo dependencies,
+recursively building the dependency tree.  Once it has the dependency tree, it
+starts with the leaves of the tree, doing a breadth-first traversal; for
+each dependency, it clones the repo, sets the repo's head to the correct
+revision and then executes the build command specified in the cargo config.
+
+This bootstrap script uses a temporary directory to store the built dependency
+libraries and uses that as a link path when linking dependencies and the
+cargo binary.  The goal is to create a statically linked cargo binary that is
+capable of being used as a "local cargo" when running the main cargo Makefiles.
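
The leaf-first ordering described above amounts to visiting each crate's
dependencies before the crate itself. A minimal sketch of that traversal
(the crate names and the example graph here are made up for illustration,
not the script's actual data):

```python
def build_order(deps):
    """Return crates in an order where every crate comes after its deps.

    deps maps crate name -> list of crate names it depends on.
    """
    order, seen = [], set()

    def visit(crate):
        if crate in seen:
            return
        seen.add(crate)
        for d in deps.get(crate, []):
            visit(d)          # recurse into dependencies (leaves) first
        order.append(crate)   # append only after all deps are placed

    for crate in deps:
        visit(crate)
    return order

print(build_order({'cargo': ['toml', 'semver'],
                   'toml': ['rustc-serialize'],
                   'semver': [],
                   'rustc-serialize': []}))
# -> ['rustc-serialize', 'toml', 'semver', 'cargo']
```

The real script additionally pins each repo to a revision and runs the
crate's build command at each step; the sketch only shows the ordering.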
+
+Dependencies
+============
+
+* pytoml -- used for parsing toml files.
+  https://github.com/avakar/pytoml
+
+* dulwich -- used for working with git repos.
+  https://git.samba.org/?p=jelmer/dulwich.git;a=summary
+
+Both can be installed via the pip tool:
+
+```sh
+sudo pip install pytoml dulwich
+```
+
+Command Line Options
+====================
+
+```
+--cargo-root <path>    specify the path to the cargo repo root.
+--target-dir <path>    specify the location to store build results.
+--crate-index <path>   path to where crates.io index should be cloned
+--no-clone             don't clone crates.io index, --crate-index must point to existing clone.
+--no-clean             don't remove the folders created during bootstrapping.
+--download             only download the crates needed to bootstrap cargo.
+--graph                output dot format graph of dependencies.
+--target <triple>      build target: e.g. x86_64-unknown-bitrig
+--host <triple>        host machine: e.g. x86_64-unknown-linux-gnu
+--test-semver          triggers the execution of the Semver and SemverRange class tests.
+```
+
+The `--cargo-root` option defaults to the current directory if unspecified.  The
+target directory defaults to the Python equivalent of `mktemp -d` if unspecified.
+The `--crate-index` option specifies where the crates.io index will be cloned.  Or,
+if you already have a clone of the index, `--crate-index` should point there
+and you should also specify `--no-clone`.  The `--target` option is used to
+specify which platform you are bootstrapping for.  The `--host` option defaults
+to the value of the `--target` option when not specified.
+
+Examples
+========
+
+To bootstrap Cargo on [Bitrig](https://bitrig.org) I followed these steps:
+
+* Cloned this [bootstrap script repo](https://github.com/dhuseby/cargo-bootstra)
+to `/tmp/bootstrap`.
+* Cloned the [crates.io index](https://github.com/rust-lang/crates.io-index)
+to `/tmp/index`.
+* Created a target folder, `/tmp/out`, for the output.
+* Cloned the [Cargo](https://github.com/rust-lang/cargo) repo to `/tmp/cargo`.
+* Copied the bootstrap.py script to the cargo repo root.
+* Ran the bootstrap.py script like so:
+```sh
+./bootstrap.py --crate-index /tmp/index --target-dir /tmp/out --no-clone --no-clean --target x86_64-unknown-bitrig
+```
+
+After the script completes, there is a Cargo executable named `cargo-0_2_0` in
+`/tmp/out`.  That executable can then be used to bootstrap Cargo from source by
+specifying it as the `--local-cargo` option to Cargo's `./configure` script.
+"""
+
+import argparse, \
+       cStringIO, \
+       hashlib, \
+       httplib, \
+       inspect, \
+       json, \
+       os, \
+       re, \
+       shutil, \
+       subprocess, \
+       sys, \
+       tarfile, \
+       tempfile, \
+       urlparse
+import pytoml as toml
+import dulwich.porcelain as git
+
+TARGET = None
+HOST = None
+GRAPH = None
+CRATES_INDEX = 'git://github.com/rust-lang/crates.io-index.git'
+CARGO_REPO = 'git://github.com/rust-lang/cargo.git'
+CRATE_API_DL = 'https://crates.io/api/v1/crates/%s/%s/download'
+SV_RANGE = re.compile('^(?P<op>(?:\<|\>|=|\<=|\>=|\^|\~))?'
+                      '(?P<major>(?:\*|0|[1-9][0-9]*))'
+                      '(\.(?P<minor>(?:\*|0|[1-9][0-9]*)))?'
+                      '(\.(?P<patch>(?:\*|0|[1-9][0-9]*)))?'
+                      '(\-(?P<prerelease>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?'
+                      '(\+(?P<build>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?$')
+SEMVER = re.compile('^(?P<major>(?:0|[1-9][0-9]*))'
+                    '(\.(?P<minor>(?:0|[1-9][0-9]*)))?'
+                    '(\.(?P<patch>(?:0|[1-9][0-9]*)))?'
+                    '(\-(?P<prerelease>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?'
+                    '(\+(?P<build>[0-9A-Za-z-]+(\.[0-9A-Za-z-]+)*))?$')
+BSCRIPT = re.compile('^cargo:(?P<key>([^\s=]+))(=(?P<value>.+))?$')
+BNAME = re.compile('^(lib)?(?P<name>([^_]+))(_.*)?$')
+BUILT = {}
+CRATES = {}
+UNRESOLVED = []
+PFX = []
+
+def idnt(f):
+    def do_indent(*cargs):
+        ret = f(*cargs)
+        return ret
+    return do_indent
+
+def dbgCtx(f):
+    def do_dbg(self, *cargs):
+        global PFX
+        PFX.append(self.name())
+        ret = f(self, *cargs)
+        PFX.pop()
+        return ret
+    return do_dbg
+
+def dbg(s):
+    global PFX
+    print '%s: %s' % (':'.join(PFX), s)
+
+class PreRelease(object):
+
+    def __init__(self, pr):
+        self._container = []
+        if pr is not None:
+            self._container += str(pr).split('.')
+
+    def __str__(self):
+        return '.'.join(self._container)
+
+    def __repr__(self):
+        return repr(self._container)
+
+    def __getitem__(self, key):
+        return self._container[key]
+
+    def __len__(self):
+        return len(self._container)
+
+    def __gt__(self, rhs):
+        return not ((self < rhs) or (self == rhs))
+
+    def __ge__(self, rhs):
+        return not (self < rhs)
+
+    def __le__(self, rhs):
+        return not (self > rhs)
+
+    def __eq__(self, rhs):
+        return self._container == rhs._container
+
+    def __ne__(self, rhs):
+        return not (self == rhs)
+
+    def __lt__(self, rhs):
+        if self == rhs:
+            return False
+
+        # not having a pre-release is higher precedence
+        if len(self) == 0:
+            # 1.0.0 > 1.0.0-alpha (the equal case was handled above)
+            return False
+        elif len(rhs) == 0:
+            # 1.0.0-alpha < 1.0.0
+            return True
+
+        # if both have one, then longer pre-releases are higher precedence
+        if len(self) > len(rhs):
+            # 1.0.0-alpha.1 > 1.0.0-alpha
+            return False
+        elif len(self) < len(rhs):
+            # 1.0.0-alpha < 1.0.0-alpha.1
+            return True
+
+        # if both have the same length pre-release, must check each piece
+        # numeric sub-parts have lower precedence than non-numeric sub-parts
+        # non-numeric sub-parts are compared lexically in ASCII sort order
+        for l,r in zip(self, rhs):
+            if l.isdigit():
+                if r.isdigit():
+                    if int(l) < int(r):
+                        # 2 > 1
+                        return True
+                    elif int(l) > int(r):
+                        # 1 < 2
+                        return False
+                    else:
+                        # 1 == 1
+                        continue
+                else:
+                    # 1 < 'foo'
+                    return True
+            else:
+                if r.isdigit():
+                    # 'foo' > 1
+                    return False
+
+            # both are non-numeric
+            if l < r:
+                return True
+            elif l > r:
+                return False
+
+        raise RuntimeError('PreRelease __lt__ failed')
+
+
+class Semver(dict):
+
+    def __init__(self, sv):
+        match = SEMVER.match(str(sv))
+        if match is None:
+            raise ValueError('%s is not a valid semver string' % sv)
+
+        self._input = sv
+        self.update(match.groupdict())
+        self.prerelease = PreRelease(self['prerelease'])
+
+    def __str__(self):
+        major, minor, patch, prerelease, build = self.parts_raw()
+        s = ''
+        if major is None:
+            s += '0'
+        else:
+            s += major
+        s += '.'
+        if minor is None:
+            s += '0'
+        else:
+            s += minor
+        s += '.'
+        if patch is None:
+            s += '0'
+        else:
+            s += patch
+        if len(self.prerelease):
+            s += '-' + str(self.prerelease)
+        if build is not None:
+            s += '+' + build
+        return s
+
+    def __hash__(self):
+        return hash(str(self))
+
+    def parts(self):
+        major, minor, patch, prerelease, build = self.parts_raw()
+        if major is None:
+            major = '0'
+        if minor is None:
+            minor = '0'
+        if patch is None:
+            patch = '0'
+        return (int(major),int(minor),int(patch),prerelease,build)
+
+    def parts_raw(self):
+        return (self['major'],self['minor'],self['patch'],self['prerelease'],self['build'])
+
+    def __lt__(self, rhs):
+        lmaj,lmin,lpat,lpre,_ = self.parts()
+        rmaj,rmin,rpat,rpre,_ = rhs.parts()
+        # compare components in order; only fall through on a tie
+        if lmaj != rmaj:
+            return lmaj < rmaj
+        if lmin != rmin:
+            return lmin < rmin
+        if lpat != rpat:
+            return lpat < rpat
+        # a pre-release sorts below the corresponding plain release
+        if lpre is not None and rpre is None:
+            return True
+        if lpre is not None and rpre is not None:
+            return self.prerelease < rhs.prerelease
+        return False
+
+    def __le__(self, rhs):
+        return not (self > rhs)
+
+    def __gt__(self, rhs):
+        return not ((self < rhs) or (self == rhs))
+
+    def __ge__(self, rhs):
+        return not (self < rhs)
+
+    def __eq__(self, rhs):
+        # build metadata is only considered for equality
+        lmaj,lmin,lpat,lpre,lbld = self.parts()
+        rmaj,rmin,rpat,rpre,rbld = rhs.parts()
+        return lmaj == rmaj and \
+               lmin == rmin and \
+               lpat == rpat and \
+               lpre == rpre and \
+               lbld == rbld
+
+    def __ne__(self, rhs):
+        return not (self == rhs)
+
+
+class SemverRange(dict):
+
+    def __init__(self, sv):
+        match = SV_RANGE.match(str(sv))
+        if match is None:
+            raise ValueError('%s is not a valid semver range string' % sv)
+
+        self._input = sv
+        self.update(match.groupdict())
+        self.prerelease = PreRelease(self['prerelease'])
+
+        # fix up the op
+        op = self['op']
+        if op is None:
+            if self['major'] == '*' or self['minor'] == '*' or self['patch'] == '*':
+                op = '*'
+            else:
+                # if no op was specified and there are no wildcards, then op
+                # defaults to '^'
+                op = '^'
+        else:
+            self._semver = Semver(sv[len(op):])
+
+        if op not in ('<=', '>=', '<', '>', '=', '^', '~', '*'):
+            raise ValueError('%s is not a valid semver operator' % op)
+
+        self['op'] = op
+
+    def parts_raw(self):
+        return (self['major'],self['minor'],self['patch'],self['prerelease'],self['build'])
+
+    def __str__(self):
+        major, minor, patch, prerelease, build = self.parts_raw()
+        if self['op'] == '*':
+            if self['major'] == '*':
+                return '*'
+            elif self['minor'] == '*':
+                return major + '.*'
+            else:
+                return major + '.' + minor + '.*'
+        else:
+            s = self['op']
+            if major is None:
+                s += '0'
+            else:
+                s += major
+            s += '.'
+            if minor is None:
+                s += '0'
+            else:
+                s += minor
+            s += '.'
+            if patch is None:
+                s += '0'
+            else:
+                s += patch
+            if len(self.prerelease):
+                s += '-' + str(self.prerelease)
+            if build is not None:
+                s += '+' + build
+            return s
+
+    def lower(self):
+        op = self['op']
+        major,minor,patch,_,_ = self.parts_raw()
+
+        if op in ('<=', '<', '=', '>', '>='):
+            return None
+
+        if op == '*':
+            # wildcards specify a range
+            if self['major'] == '*':
+                return Semver('0.0.0')
+            elif self['minor'] == '*':
+                return Semver(major + '.0.0')
+            elif self['patch'] == '*':
+                return Semver(major + '.' + minor + '.0')
+        elif op == '^':
+            # caret specifies a range
+            if patch is None:
+                if minor is None:
+                    # ^0 means >=0.0.0 and <1.0.0
+                    return Semver(major + '.0.0')
+                else:
+                    # ^0.0 means >=0.0.0 and <0.1.0
+                    return Semver(major + '.' + minor + '.0')
+            else:
+                # ^0.0.1 means >=0.0.1 and <0.0.2
+                # ^0.1.2 means >=0.1.2 and <0.2.0
+                # ^1.2.3 means >=1.2.3 and <2.0.0
+                if int(major) == 0:
+                    if int(minor) == 0:
+                        # ^0.0.1
+                        return Semver('0.0.' + patch)
+                    else:
+                        # ^0.1.2
+                        return Semver('0.' + minor + '.' + patch)
+                else:
+                    # ^1.2.3
+                    return Semver(major + '.' + minor + '.' + patch)
+        elif op == '~':
+            # tilde specifies a minimal range
+            if patch is None:
+                if minor is None:
+                    # ~0 means >=0.0.0 and <1.0.0
+                    return Semver(major + '.0.0')
+                else:
+                    # ~0.0 means >=0.0.0 and <0.1.0
+                    return Semver(major + '.' + minor + '.0')
+            else:
+                # ~0.0.1 means >=0.0.1 and <0.1.0
+                # ~0.1.2 means >=0.1.2 and <0.2.0
+                # ~1.2.3 means >=1.2.3 and <1.3.0
+                return Semver(major + '.' + minor + '.' + patch)
+
+        raise RuntimeError('No lower bound')
+
+    def upper(self):
+        op = self['op']
+        major,minor,patch,_,_ = self.parts_raw()
+
+        if op in ('<=', '<', '=', '>', '>='):
+            return None
+
+        if op == '*':
+            # wildcards specify a range
+            if self['major'] == '*':
+                return None
+            elif self['minor'] == '*':
+                return Semver(str(int(major) + 1) + '.0.0')
+            elif self['patch'] == '*':
+                return Semver(major + '.' + str(int(minor) + 1) + '.0')
+        elif op == '^':
+            # caret specifies a range
+            if patch is None:
+                if minor is None:
+                    # ^0 means >=0.0.0 and <1.0.0
+                    return Semver(str(int(major) + 1) + '.0.0')
+                else:
+                    # ^0.0 means >=0.0.0 and <0.1.0
+                    return Semver(major + '.' + str(int(minor) + 1) + '.0')
+            else:
+                # ^0.0.1 means >=0.0.1 and <0.0.2
+                # ^0.1.2 means >=0.1.2 and <0.2.0
+                # ^1.2.3 means >=1.2.3 and <2.0.0
+                if int(major) == 0:
+                    if int(minor) == 0:
+                        # ^0.0.1
+                        return Semver('0.0.' + str(int(patch) + 1))
+                    else:
+                        # ^0.1.2
+                        return Semver('0.' + str(int(minor) + 1) + '.0')
+                else:
+                    # ^1.2.3
+                    return Semver(str(int(major) + 1) + '.0.0')
+        elif op == '~':
+            # tilde specifies a minimal range
+            if patch is None:
+                if minor is None:
+                    # ~0 means >=0.0.0 and <1.0.0
+                    return Semver(str(int(major) + 1) + '.0.0')
+                else:
+                    # ~0.0 means >=0.0.0 and <0.1.0
+                    return Semver(major + '.' + str(int(minor) + 1) + '.0')
+            else:
+                # ~0.0.1 means >=0.0.1 and <0.1.0
+                # ~0.1.2 means >=0.1.2 and <0.2.0
+                # ~1.2.3 means >=1.2.3 and <1.3.0
+                return Semver(major + '.' + str(int(minor) + 1) + '.0')
+
+        raise RuntimeError('No upper bound')
+
+    def compare(self, sv):
+        if type(sv) is not Semver:
+            sv = Semver(sv)
+
+        op = self['op']
+        major,minor,patch,_,_ = self.parts_raw()
+
+        if op == '*':
+            if self['major'] == '*':
+                return sv >= Semver('0.0.0')
+
+            return (sv >= self.lower()) and (sv < self.upper())
+        elif op == '^':
+            return (sv >= self.lower()) and (sv < self.upper())
+        elif op == '~':
+            return (sv >= self.lower()) and (sv < self.upper())
+        elif op == '<=':
+            return sv <= self._semver
+        elif op == '>=':
+            return sv >= self._semver
+        elif op == '<':
+            return sv < self._semver
+        elif op == '>':
+            return sv > self._semver
+        elif op == '=':
+            return sv == self._semver
+
+        raise RuntimeError('Semver comparison failed to find a matching op')
+
+def test_semver():
+    print '\ntesting parsing:'
+    print '"1"                    is: "%s"' % Semver("1")
+    print '"1.1"                  is: "%s"' % Semver("1.1")
+    print '"1.1.1"                is: "%s"' % Semver("1.1.1")
+    print '"1.1.1-alpha"          is: "%s"' % Semver("1.1.1-alpha")
+    print '"1.1.1-alpha.1"        is: "%s"' % Semver("1.1.1-alpha.1")
+    print '"1.1.1-alpha+beta"     is: "%s"' % Semver("1.1.1-alpha+beta")
+    print '"1.1.1-alpha.1+beta"   is: "%s"' % Semver("1.1.1-alpha.1+beta")
+    print '"1.1.1-alpha.1+beta.1" is: "%s"' % Semver("1.1.1-alpha.1+beta.1")
+
+    print '\ntesting equality:'
+    print '"1"                    == "1.0.0"                is: %s' % (Semver("1") == Semver("1.0.0"))
+    print '"1.1"                  == "1.1.0"                is: %s' % (Semver("1.1") == Semver("1.1.0"))
+    print '"1.1.1"                == "1.1.1"                is: %s' % (Semver("1.1.1") == Semver("1.1.1"))
+    print '"1.1.1-alpha"          == "1.1.1-alpha"          is: %s' % (Semver("1.1.1-alpha") == Semver("1.1.1-alpha"))
+    print '"1.1.1-alpha.1"        == "1.1.1-alpha.1"        is: %s' % (Semver("1.1.1-alpha.1") == Semver("1.1.1-alpha.1"))
+    print '"1.1.1-alpha+beta"     == "1.1.1-alpha+beta"     is: %s' % (Semver("1.1.1-alpha+beta") == Semver("1.1.1-alpha+beta"))
+    print '"1.1.1-alpha.1+beta"   == "1.1.1-alpha.1+beta"   is: %s' % (Semver("1.1.1-alpha.1+beta") == Semver("1.1.1-alpha.1+beta"))
+    print '"1.1.1-alpha.1+beta.1" == "1.1.1-alpha.1+beta.1" is: %s' % (Semver("1.1.1-alpha.1+beta.1") == Semver("1.1.1-alpha.1+beta.1"))
+
+    print '\ntesting less than:'
+    print '"1"                  < "2.0.0"              is: %s' % (Semver("1") < Semver("2.0.0"))
+    print '"1.1"                < "1.2.0"              is: %s' % (Semver("1.1") < Semver("1.2.0"))
+    print '"1.1.1"              < "1.1.2"              is: %s' % (Semver("1.1.1") < Semver("1.1.2"))
+    print '"1.1.1-alpha"        < "1.1.1"              is: %s' % (Semver("1.1.1-alpha") < Semver("1.1.1"))
+    print '"1.1.1-alpha"        < "1.1.1-beta"         is: %s' % (Semver("1.1.1-alpha") < Semver("1.1.1-beta"))
+    print '"1.1.1-1"            < "1.1.1-alpha"        is: %s' % (Semver("1.1.1-1") < Semver("1.1.1-alpha"))
+    print '"1.1.1-alpha"        < "1.1.1-alpha.1"      is: %s' % (Semver("1.1.1-alpha") < Semver("1.1.1-alpha.1"))
+    print '"1.1.1-alpha.1"      < "1.1.1-alpha.2"      is: %s' % (Semver("1.1.1-alpha.1") < Semver("1.1.1-alpha.2"))
+    print '"1.1.1-alpha+beta"   < "1.1.1+beta"         is: %s' % (Semver("1.1.1-alpha+beta") < Semver("1.1.1+beta"))
+    print '"1.1.1-alpha+beta"   < "1.1.1-beta+beta"    is: %s' % (Semver("1.1.1-alpha+beta") < Semver("1.1.1-beta+beta"))
+    print '"1.1.1-1+beta"       < "1.1.1-alpha+beta"   is: %s' % (Semver("1.1.1-1+beta") < Semver("1.1.1-alpha+beta"))
+    print '"1.1.1-alpha+beta"   < "1.1.1-alpha.1+beta" is: %s' % (Semver("1.1.1-alpha+beta") < Semver("1.1.1-alpha.1+beta"))
+    print '"1.1.1-alpha.1+beta" < "1.1.1-alpha.2+beta" is: %s' % (Semver("1.1.1-alpha.1+beta") < Semver("1.1.1-alpha.2+beta"))
+
+    print '\ntesting semver range parsing:'
+    print '"0"      lower: %s, upper: %s' % (SemverRange('0').lower(), SemverRange('0').upper())
+    print '"0.0"    lower: %s, upper: %s' % (SemverRange('0.0').lower(), SemverRange('0.0').upper())
+    print '"0.0.0"  lower: %s, upper: %s' % (SemverRange('0.0.0').lower(), SemverRange('0.0.0').upper())
+    print '"0.0.1"  lower: %s, upper: %s' % (SemverRange('0.0.1').lower(), SemverRange('0.0.1').upper())
+    print '"0.1.1"  lower: %s, upper: %s' % (SemverRange('0.1.1').lower(), SemverRange('0.1.1').upper())
+    print '"1.1.1"  lower: %s, upper: %s' % (SemverRange('1.1.1').lower(), SemverRange('1.1.1').upper())
+    print '"^0"     lower: %s, upper: %s' % (SemverRange('^0').lower(), SemverRange('^0').upper())
+    print '"^0.0"   lower: %s, upper: %s' % (SemverRange('^0.0').lower(), SemverRange('^0.0').upper())
+    print '"^0.0.0" lower: %s, upper: %s' % (SemverRange('^0.0.0').lower(), SemverRange('^0.0.0').upper())
+    print '"^0.0.1" lower: %s, upper: %s' % (SemverRange('^0.0.1').lower(), SemverRange('^0.0.1').upper())
+    print '"^0.1.1" lower: %s, upper: %s' % (SemverRange('^0.1.1').lower(), SemverRange('^0.1.1').upper())
+    print '"^1.1.1" lower: %s, upper: %s' % (SemverRange('^1.1.1').lower(), SemverRange('^1.1.1').upper())
+    print '"~0"     lower: %s, upper: %s' % (SemverRange('~0').lower(), SemverRange('~0').upper())
+    print '"~0.0"   lower: %s, upper: %s' % (SemverRange('~0.0').lower(), SemverRange('~0.0').upper())
+    print '"~0.0.0" lower: %s, upper: %s' % (SemverRange('~0.0.0').lower(), SemverRange('~0.0.0').upper())
+    print '"~0.0.1" lower: %s, upper: %s' % (SemverRange('~0.0.1').lower(), SemverRange('~0.0.1').upper())
+    print '"~0.1.1" lower: %s, upper: %s' % (SemverRange('~0.1.1').lower(), SemverRange('~0.1.1').upper())
+    print '"~1.1.1" lower: %s, upper: %s' % (SemverRange('~1.1.1').lower(), SemverRange('~1.1.1').upper())
+    print '"*"      lower: %s, upper: %s' % (SemverRange('*').lower(), SemverRange('*').upper())
+    print '"0.*"    lower: %s, upper: %s' % (SemverRange('0.*').lower(), SemverRange('0.*').upper())
+    print '"0.0.*"  lower: %s, upper: %s' % (SemverRange('0.0.*').lower(), SemverRange('0.0.*').upper())
+
+
+class Runner(object):
+
+    def __init__(self, c, e, cwd=None):
+        self._cmd = c
+        if type(self._cmd) is not list:
+            self._cmd = [self._cmd]
+        self._env = e
+        self._output = []
+        self._returncode = 0
+        self._cwd = cwd
+
+    def __call__(self, c, e):
+        cmd = self._cmd + c
+        env = dict(self._env, **e)
+        #dbg(' env: %s' % e)
+        dbg(' '.join(cmd))
+
+        proc = subprocess.Popen(cmd, env=env, \
+                                stdout=subprocess.PIPE, cwd=self._cwd)
+        while proc.poll() is None:
+            l = proc.stdout.readline().rstrip('\n')
+            if len(l) > 0:
+                self._output.append(l)
+                dbg(l)
+                sys.stdout.flush()
+        self._returncode = proc.wait()
+        return self._output
+
+    def output(self):
+        return self._output
+
+    def returncode(self):
+        return self._returncode
+
+class RustcRunner(Runner):
+
+    def __call__(self, c, e):
+        super(RustcRunner, self).__call__(c, e)
+        return ([], {}, {})
+
+class BuildScriptRunner(Runner):
+
+    def __call__(self, c, e):
+        super(BuildScriptRunner, self).__call__(c, e)
+
+        # parse the output for cargo: lines
+        cmd = []
+        env = {}
+        denv = {}
+        for l in self.output():
+            match = BSCRIPT.match(str(l))
+            if match is None:
+                continue
+            pieces = match.groupdict()
+            k = pieces['key']
+            v = pieces['value']
+
+            if k == 'rustc-link-lib':
+                cmd += ['-l', v]
+            elif k == 'rustc-link-search':
+                cmd += ['-L', v]
+            elif k == 'rustc-cfg':
+                cmd += ['--cfg', v]
+                env['CARGO_FEATURE_%s' % v.upper().replace('-','_')] = '1'
+            else:
+                denv[k] = v
+        return (cmd, env, denv)
+
+class Crate(object):
+
+    def __init__(self, crate, ver, deps, cdir, build):
+        self._crate = str(crate)
+        self._version = Semver(ver)
+        self._dep_info = deps
+        self._dir = cdir
+        # put the build scripts first
+        self._build = filter(lambda x: x.get('type', None) == 'build_script', build)
+        # then add the lib/bin builds
+        self._build += filter(lambda x: x.get('type', None) != 'build_script', build)
+        self._resolved = False
+        self._deps = {}
+        self._refs = []
+        self._env = {}
+        self._dep_env = {}
+
+    def name(self):
+        return self._crate
+
+    def version(self):
+        return self._version
+
+    def dir(self):
+        return self._dir
+
+    def __str__(self):
+        return '%s-%s' % (self.name(), self.version())
+
+    def add_dep(self, crate, features):
+        if self._deps.has_key(str(crate)):
+            return
+
+        features = [str(x) for x in features]
+        self._deps[str(crate)] = { 'features': features }
+        crate.add_ref(self)
+
+    def add_ref(self, crate):
+        if str(crate) not in self._refs:
+            self._refs.append(str(crate))
+
+    def resolved(self):
+        return self._resolved
+
+    @dbgCtx
+    def resolve(self, tdir, idir, graph=None):
+        global CRATES
+        global UNRESOLVED
+
+        if self._resolved:
+            return
+        if CRATES.has_key(str(self)):
+            return
+
+        if self._dep_info is not None:
+            print ''
+            dbg('Resolving dependencies for: %s' % str(self))
+            for d in self._dep_info:
+                kind = d.get('kind', 'normal')
+                if kind not in ('normal', 'build'):
+                    print ''
+                    dbg('Skipping %s dep %s' % (kind, d['name']))
+                    continue
+
+                optional = d.get('optional', False)
+                if optional:
+                    print ''
+                    dbg('Skipping optional dep %s' % d['name'])
+                    continue
+
+                svr = SemverRange(d['req'])
+                print ''
+                dbg('Looking up info for %s %s' % (d['name'], str(svr)))
+                if d.get('local', None) is None:
+                    name, ver, deps, ftrs, cksum = crate_info_from_index(idir, d['name'], svr)
+                    cdir = dl_and_check_crate(tdir, name, ver, cksum)
+                    _, tver, tdeps, build = crate_info_from_toml(cdir)
+                    deps += tdeps
+                else:
+                    cdir = d['path']
+                    name, ver, deps, build = crate_info_from_toml(cdir)
+                    ftrs = {}  # local crates carry no index feature table
+
+                try:
+                    dcrate = Crate(name, ver, deps, cdir, build)
+                    if CRATES.has_key(str(dcrate)):
+                        dcrate = CRATES[str(dcrate)]
+                    UNRESOLVED.append(dcrate)
+                    if graph is not None:
+                        print >> graph, '"%s" -> "%s";' % (str(self), str(dcrate))
+
+                except:
+                    dcrate = None
+
+                # clean up the list of features that are enabled
+                tftrs = d.get('features', [])
+                if type(tftrs) is dict:
+                    tftrs = tftrs.keys()
+                else:
+                    tftrs = filter(lambda x: len(x) > 0, tftrs)
+
+                # add 'default' if default_features is true
+                if d.get('default_features', True):
+                    tftrs.append('default')
+
+                features = []
+                if type(ftrs) is dict:
+                    # add any available features that are activated by the
+                    # dependency entry in the parent's dependency record,
+                    # and any features they depend on recursively
+                    def add_features(f):
+                        if ftrs.has_key(f):
+                            for k in ftrs[f]:
+                                # guard against infinite recursion
+                                if not k in features:
+                                    features.append(k)
+                                    add_features(k)
+                    for k in tftrs:
+                        add_features(k)
+                else:
+                    features += filter(lambda x: (len(x) > 0) and (x in tftrs), ftrs)
+
+                if dcrate is not None:
+                    self.add_dep(dcrate, features)
+
+        self._resolved = True
+        CRATES[str(self)] = self
+
+    @dbgCtx
+    def build(self, by, out_dir, features=()):
+        global BUILT
+        global CRATES
+        global TARGET
+        global HOST
+
+        extra_filename = '-' + str(self.version()).replace('.','_')
+        output_name = self.name().replace('-','_')
+        output = os.path.join(out_dir, 'lib%s%s.rlib' % (output_name, extra_filename))
+
+        if BUILT.has_key(str(self)):
+            return ({'name':self.name(), 'lib':output}, self._env)
+
+        externs = []
+        for dep,info in self._deps.iteritems():
+            if CRATES.has_key(dep):
+                extern, env = CRATES[dep].build(self, out_dir, info['features'])
+                externs.append(extern)
+                self._dep_env[CRATES[dep].name()] = env
+
+        if os.path.isfile(output):
+            print ''
+            dbg('Skipping %s, already built (needed by: %s)' % (str(self), str(by)))
+            BUILT[str(self)] = str(by)
+            return ({'name':self.name(), 'lib':output}, self._env)
+
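+        # Note: the variables set below mirror cargo's build-script
+        # convention (OUT_DIR, TARGET, HOST, CARGO_FEATURE_<NAME>=1 per
+        # enabled feature, DEP_<LIB>_<KEY> values exported by dependency
+        # build scripts); this is a minimal subset, not the full set.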
+        # build the environment for subcommands
+        env = dict(os.environ)
+        env['OUT_DIR'] = out_dir
+        env['TARGET'] = TARGET
+        env['HOST'] = HOST
+        env['NUM_JOBS'] = '1'
+        env['OPT_LEVEL'] = '0'
+        env['DEBUG'] = '0'
+        env['PROFILE'] = 'release'
+        env['CARGO_MANIFEST_DIR'] = self.dir()
+        env['CARGO_PKG_VERSION_MAJOR'] = self.version()['major']
+        env['CARGO_PKG_VERSION_MINOR'] = self.version()['minor']
+        env['CARGO_PKG_VERSION_PATCH'] = self.version()['patch']
+        pre = self.version()['prerelease']
+        if pre is None:
+            pre = ''
+        env['CARGO_PKG_VERSION_PRE'] = pre
+        env['CARGO_PKG_VERSION'] = str(self.version())
+        for f in features:
+            env['CARGO_FEATURE_%s' % f.upper().replace('-','_')] = '1'
+        for l,e in self._dep_env.iteritems():
+            for k,v in e.iteritems():
+                if type(v) is not str and type(v) is not unicode:
+                    v = str(v)
+                env['DEP_%s_%s' % (l.upper(), k.upper())] = v
+
+        # create the builders; build scripts come first
+        cmds = []
+        for b in self._build:
+            v = str(self._version).replace('.','_')
+            cmd = ['rustc']
+            cmd.append(os.path.join(self._dir, b['path']))
+            cmd.append('--crate-name')
+            if b['type'] == 'lib':
+                cmd.append(b['name'].replace('-','_'))
+                cmd.append('--crate-type')
+                cmd.append('lib')
+            elif b['type'] == 'build_script':
+                cmd.append('build_script_%s' % b['name'].replace('-','_'))
+                cmd.append('--crate-type')
+                cmd.append('bin')
+            else:
+                cmd.append(b['name'].replace('-','_'))
+                cmd.append('--crate-type')
+                cmd.append('bin')
+
+            for f in features:
+                cmd.append('--cfg')
+                cmd.append('feature=\"%s\"' % f)
+
+            cmd.append('-C')
+            cmd.append('extra-filename=' + extra_filename)
+
+            cmd.append('--out-dir')
+            cmd.append('%s' % out_dir)
+            cmd.append('-L')
+            cmd.append('%s' % out_dir)
+            cmd.append('-L')
+            cmd.append('%s/lib' % out_dir)
+
+            for e in externs:
+                cmd.append('--extern')
+                cmd.append('%s=%s' % (e['name'].replace('-','_'), e['lib']))
+
+            # add in the native libraries to link to
+            if b['type'] != 'build_script':
+                for l in b.get('links', []):
+                    cmd.append('-l')
+                    cmd.append(l)
+
+            # get the pkg key name
+            match = BNAME.match(b['name'])
+            if match is not None:
+                match = match.groupdict()['name'].replace('-','_')
+
+            # queue up the runner
+            cmds.append({'name':b['name'], 'env_key':match, 'cmd':RustcRunner(cmd, env)})
+
+            # queue up the build script runner
+            if b['type'] == 'build_script':
+                bcmd = os.path.join(out_dir, 'build_script_%s-%s' % (b['name'].replace('-','_'), v))
+                cmds.append({'name':b['name'], 'env_key':match, 'cmd':BuildScriptRunner(bcmd, env, self._dir)})
+
+        print ''
+        dbg('Building %s (needed by: %s)' % (str(self), str(by)))
+
+        bcmd = []
+        benv = {}
+        for c in cmds:
+            runner = c['cmd']
+
+            (c1, e1, e2) = runner(bcmd, benv)
+
+            if runner.returncode() != 0:
+                raise RuntimeError('build command failed: %s' % runner.returncode())
+
+            bcmd += c1
+            benv = dict(benv, **e1)
+
+            key = c['env_key']
+            for k,v in e2.iteritems():
+                self._env['DEP_%s_%s' % (key.upper(), k.upper())] = v
+
+
+        BUILT[str(self)] = str(by)
+        return ({'name':self.name(), 'lib':output}, self._env)
+
+@idnt
+def dl_crate(url, depth=0):
+    if depth > 10:
+        raise RuntimeError('too many redirects')
+
+    loc = urlparse.urlparse(url)
+    if loc.scheme == 'https':
+        conn = httplib.HTTPSConnection(loc.netloc)
+    elif loc.scheme == 'http':
+        conn = httplib.HTTPConnection(loc.netloc)
+    else:
+        raise RuntimeError('unsupported url scheme: %s' % loc.scheme)
+
+    conn.request("GET", loc.path)
+    res = conn.getresponse()
+    dbg('%sconnected to %s...%s' % ((' ' * depth), url, res.status))
+    headers = dict(res.getheaders())
+    if headers.has_key('location') and headers['location'] != url:
+        return dl_crate(headers['location'], depth + 1)
+
+    return res.read()
+
+@idnt
+def dl_and_check_crate(tdir, name, ver, cksum):
+    global CRATES
+    try:
+        cname = '%s-%s' % (name, ver)
+        cdir = os.path.join(tdir, cname)
+        if CRATES.has_key(cname):
+            dbg('skipping %s...already downloaded' % cname)
+            return cdir
+
+        if not os.path.isdir(cdir):
+            dbg('Downloading %s source to %s' % (cname, cdir))
+            dl = CRATE_API_DL % (name, ver)
+            buf = dl_crate(dl)
+            if (cksum is not None):
+                h = hashlib.sha256()
+                h.update(buf)
+                if h.hexdigest() == cksum:
+                    dbg('Checksum is good...%s' % cksum)
+                else:
+                    raise RuntimeError('checksum mismatch for %s (%s != %s)' % (cname, h.hexdigest(), cksum))
+
+            fbuf = cStringIO.StringIO(buf)
+            with tarfile.open(fileobj=fbuf) as tf:
+                dbg('unpacking result to %s...' % cdir)
+                tf.extractall(path=tdir)
+
+    except Exception, e:
+        raise e
+
+    return cdir
+
+@idnt
+def crate_info_from_toml(cdir):
+    try:
+        with open(os.path.join(cdir, 'Cargo.toml'), 'rb') as ctoml:
+            cfg = toml.load(ctoml)
+            build = []
+            p = cfg.get('package',cfg.get('project', {}))
+            name = p.get('name', None)
+            ver = p.get('version', None)
+            if (name is None) or (ver is None):
+                raise RuntimeError('invalid .toml file format')
+
+            # look for a "links" item
+            lnks = p.get('links', [])
+            if type(lnks) is not list:
+                lnks = [lnks]
+
+            # look for a "build" item
+            bf = p.get('build', None)
+
+            # if we have a 'links', there must be a 'build'
+            if len(lnks) > 0 and bf is None:
+                raise RuntimeError('cargo requires a "build" item if "links" is specified')
+
+            # there can be target specific build script overrides
+            boverrides = {}
+            for lnk in lnks:
+                boverrides.update(cfg.get('target', {}).get(TARGET, {}).get(lnk, {}))
+
+            bmain = False
+            if bf is not None:
+                build.append({'type':'build_script', \
+                              'path':[ bf ], \
+                              'name':name.replace('-','_'), \
+                              'links': lnks, \
+                              'overrides': boverrides})
+
+            # look for libs array
+            libs = cfg.get('lib', [])
+            if type(libs) is not list:
+                libs = [libs]
+            for l in libs:
+                l['type'] = 'lib'
+                l['links'] = lnks
+                if l.get('path', None) is None:
+                    l['path'] = [ 'lib.rs' ]
+                build.append(l)
+                bmain = True
+
+            # look for bins array
+            bins = cfg.get('bin', [])
+            if type(bins) is not list:
+                bins = [bins]
+            for b in bins:
+                if b.get('path', None) is None:
+                    b['path'] = [ os.path.join('bin', '%s.rs' % b['name']), os.path.join('bin', 'main.rs'), '%s.rs' % b['name'], 'main.rs' ]
+                build.append({'type': 'bin', \
+                              'name':b['name'], \
+                              'path':b['path'], \
+                              'links': lnks})
+                bmain = True
+
+            # if no explicit directions on what to build, then add a default
+            if not bmain:
+                build.append({'type':'lib', 'path':'lib.rs', 'name':name.replace('-','_')})
+
+            for b in build:
+                # make sure the path is a list of possible paths
+                if type(b['path']) is not list:
+                    b['path'] = [ b['path'] ]
+                bin_paths = []
+                for p in b['path']:
+                    bin_paths.append(os.path.join(cdir, p))
+                    bin_paths.append(os.path.join(cdir, 'src', p))
+
+                found_path = None
+                for p in bin_paths:
+                    if os.path.isfile(p):
+                        found_path = p
+                        break
+
+                if found_path is None:
+                    raise RuntimeError('could not find %s to build in %s' % (build, cdir))
+                b['path'] = found_path
+
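+            # merge the three dependency tables; dict.update() means later
+            # tables win, so target-specific deps override the generic ones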
+            d = cfg.get('build-dependencies', {})
+            d.update(cfg.get('dependencies', {}))
+            d.update(cfg.get('target', {}).get(TARGET, {}).get('dependencies', {}))
+            deps = []
+            for k,v in d.iteritems():
+                if type(v) is not dict:
+                    deps.append({'name':k, 'req': v})
+                elif v.has_key('path'):
+                    if v.get('version', None) is None:
+                        deps.append({'name':k, 'path':os.path.join(cdir, v['path']), 'local':True, 'req':0})
+                    else:
+                        ftrs = v.get('features',[])
+                        deps.append({'name':k, 'path': v['path'], 'req':v['version'], 'features':ftrs})
+                else:
+                    ftrs = v.get('features',[])
+                    deps.append({'name':k, 'req':v['version'], 'features':ftrs})
+
+            return (name, ver, deps, build)
+
+    except Exception, e:
+        dbg('failed to load toml file for: %s (%s)' % (cdir, str(e)))
+
+    return (None, None, [], [])
+
+@idnt
+def crate_info_from_index(idir, name, svr):
+    global TARGET
+
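+    # the crates.io index shards metadata files by crate-name length:
+    # 1/<name>, 2/<name>, 3/<first letter>/<name>, and
+    # <first two letters>/<next two letters>/<name> for longer names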
+    if len(name) == 1:
+        ipath = os.path.join(idir, '1', name)
+    elif len(name) == 2:
+        ipath = os.path.join(idir, '2', name)
+    elif len(name) == 3:
+        ipath = os.path.join(idir, '3', name[0:1], name)
+    else:
+        ipath = os.path.join(idir, name[0:2], name[2:4], name)
+
+    dbg('opening crate info: %s' % ipath)
+    dep_infos = []
+    with open(ipath, 'rb') as fin:
+        lines = fin.readlines()
+        for l in lines:
+            dep_infos.append(json.loads(l))
+
+    passed = {}
+    for info in dep_infos:
+        if not info.has_key('vers'):
+            continue
+        sv = Semver(info['vers'])
+        if svr.compare(sv):
+            passed[sv] = info
+
+    if len(passed) == 0:
+        raise RuntimeError('no version of %s matches %s' % (name, str(svr)))
+    keys = sorted(passed.iterkeys())
+    best_match = keys.pop()
+    dbg('best match is %s-%s' % (name, best_match))
+    best_info = passed[best_match]
+    name = best_info.get('name', None)
+    ver = best_info.get('vers', None)
+    deps = best_info.get('deps', [])
+    ftrs = best_info.get('features', [])
+    cksum = best_info.get('cksum', None)
+
+    # only include deps without a 'target' or ones with matching 'target'
+    deps = filter(lambda x: x.get('target', TARGET) == TARGET, deps)
+
+    return (name, ver, deps, ftrs, cksum)
+
+def args_parser():
+    parser = argparse.ArgumentParser(description='Cargo Bootstrap Tool')
+    parser.add_argument('--cargo-root', type=str,  default=os.getcwd(),
+                        help="specify the cargo repo root path")
+    parser.add_argument('--target-dir', type=str, default=tempfile.mkdtemp(),
+                        help="specify the path for storing built dependency libs")
+    parser.add_argument('--crate-index', type=str, default=None,
+                        help="path to where the crate index should be cloned")
+    parser.add_argument('--target', type=str, default=None,
+                        help="target triple for machine we're bootstrapping for")
+    parser.add_argument('--host', type=str, default=None,
+                        help="host triple for machine we're bootstrapping on")
+    parser.add_argument('--test-semver', action='store_true',
+                        help="run semver parsing tests")
+    parser.add_argument('--no-clone', action='store_true',
+                        help="skip cloning crates index; --crate-index must point to an existing clone of the crates index")
+    parser.add_argument('--no-clean', action='store_true',
+                        help="don't delete the target dir and crate index")
+    parser.add_argument('--download', action='store_true',
+                        help="only download the crates needed to build cargo")
+    parser.add_argument('--graph', action='store_true',
+                        help="output a dot graph of the dependencies")
+    return parser
+
+@idnt
+def open_or_clone_repo(rdir, rurl, no_clone):
+    try:
+        repo = git.open_repo(rdir)
+        return repo
+    except Exception:
+        repo = None
+
+    if repo is None and no_clone is False:
+        dbg('Cloning %s to %s' % (rurl, rdir))
+        return git.clone(rurl, rdir)
+
+    return repo
+
+if __name__ == "__main__":
+    try:
+        # parse args
+        parser = args_parser()
+        args = parser.parse_args()
+
+        if args.test_semver:
+            test_semver()
+            sys.exit(0)
+
+        # clone the cargo index
+        if args.crate_index is None:
+            args.crate_index = os.path.normpath(os.path.join(args.target_dir, 'index'))
+        dbg('cargo: %s, target: %s, index: %s' % \
+              (args.cargo_root, args.target_dir, args.crate_index))
+
+        TARGET = args.target
+        HOST = args.host
+        index = open_or_clone_repo(args.crate_index, CRATES_INDEX, args.no_clone)
+        cargo = open_or_clone_repo(args.cargo_root, CARGO_REPO, args.no_clone)
+
+# TODO(lucab): also check for config.json and Cargo.toml before aborting
+#        if index is None:
+#            raise RuntimeError('You must have a local clone of the crates index ' \
+#                               'or omit --no-clone to allow this script to clone ' \
+#                               'it for you.')
+#        if cargo is None:
+#            raise RuntimeError('You must have a local clone of the cargo repo '\
+#                               'so that this script can read the cargo toml file.')
+
+        if TARGET is None:
+            raise RuntimeError('You must specify the target triple of this machine')
+        if HOST is None:
+            HOST = TARGET
+
+    except Exception, e:
+        frame = inspect.trace()[-1]
+        print >> sys.stderr, "\nException:\n from %s, line %d:\n %s\n" % (frame[1], frame[2], e)
+        parser.print_help()
+        if not args.no_clean:
+            shutil.rmtree(args.target_dir)
+        sys.exit(1)
+
+    try:
+
+        # load cargo deps
+        name, ver, deps, build = crate_info_from_toml(args.cargo_root)
+        cargo_crate = Crate(name, ver, deps, args.cargo_root, build)
+        UNRESOLVED.append(cargo_crate)
+
+        if args.graph:
+            GRAPH = open(os.path.join(args.target_dir, 'deps.dot'), 'wb')
+            print >> GRAPH, "digraph %s {" % name
+
+        # resolve and download all of the dependencies
+        print ''
+        print '===================================='
+        print '===== DOWNLOADING DEPENDENCIES ====='
+        print '===================================='
+        while len(UNRESOLVED) > 0:
+            crate = UNRESOLVED.pop(0)
+            crate.resolve(args.target_dir, args.crate_index, GRAPH)
+
+        if args.graph:
+            print >> GRAPH, "}"
+            GRAPH.close()
+
+        if args.download:
+            print "done downloading..."
+            sys.exit(0)
+
+        # build cargo
+        print ''
+        print '=========================='
+        print '===== BUILDING CARGO ====='
+        print '=========================='
+        cargo_crate.build('bootstrap.py', args.target_dir)
+
+        # cleanup
+        if not args.no_clean:
+            print "cleaning up..."
+            shutil.rmtree(args.target_dir)
+        print "done"
+
+    except Exception, e:
+        frame = inspect.trace()[-1]
+        print >> sys.stderr, "\nException:\n from %s, line %d:\n %s\n" % (frame[1], frame[2], e)
+        if not args.no_clean:
+            shutil.rmtree(args.target_dir)
+        sys.exit(1)
+
+
diff --git a/debian/cargo.bash-completion b/debian/cargo.bash-completion
new file mode 100644 (file)
index 0000000..49a6f00
--- /dev/null
@@ -0,0 +1 @@
+src/etc/cargo.bashcomp.sh
diff --git a/debian/cargo.dirs b/debian/cargo.dirs
new file mode 100644 (file)
index 0000000..e772481
--- /dev/null
@@ -0,0 +1 @@
+usr/bin
diff --git a/debian/cargo.doc-base b/debian/cargo.doc-base
new file mode 100644 (file)
index 0000000..9f871df
--- /dev/null
@@ -0,0 +1,8 @@
+Document: cargo-rustdoc
+Title: Cargo RustDoc
+Abstract: The Cargo guide
+Section: Programming/Rust
+
+Format: HTML
+Index: /usr/share/doc/cargo/doc/index.html
+Files: /usr/share/doc/cargo/doc/*.html
diff --git a/debian/cargo.docs b/debian/cargo.docs
new file mode 100644 (file)
index 0000000..dcac9d9
--- /dev/null
@@ -0,0 +1 @@
+target/doc
diff --git a/debian/cargo.lintian-overrides b/debian/cargo.lintian-overrides
new file mode 100644 (file)
index 0000000..327e816
--- /dev/null
@@ -0,0 +1,2 @@
+# jquery-2.1.1 is not yet packaged
+cargo: privacy-breach-may-use-debian-package usr/share/doc/cargo/doc/*
diff --git a/debian/cargo.manpages b/debian/cargo.manpages
new file mode 100644 (file)
index 0000000..51eccee
--- /dev/null
@@ -0,0 +1 @@
+src/etc/cargo.1
diff --git a/debian/changelog b/debian/changelog
new file mode 100644 (file)
index 0000000..286f75f
--- /dev/null
@@ -0,0 +1,6 @@
+cargo (0.3.0-0~exp1) experimental; urgency=low
+
+  * Team upload.
+  * Initial Debian release. (Closes: #786432)
+
+ -- Luca Bruno <lucab@debian.org>  Tue, 11 Aug 2015 20:15:54 +0200
diff --git a/debian/compat b/debian/compat
new file mode 100644 (file)
index 0000000..ec63514
--- /dev/null
@@ -0,0 +1 @@
+9
diff --git a/debian/control b/debian/control
new file mode 100644 (file)
index 0000000..66242a9
--- /dev/null
@@ -0,0 +1,46 @@
+Source: cargo
+Section: devel
+Maintainer: Rust Maintainers <pkg-rust-maintainers@lists.alioth.debian.org>
+Uploaders: Luca Bruno <lucab@debian.org>,
+ Angus Lees <gus@debian.org>
+Priority: extra
+Build-Depends: debhelper (>= 9),
+               rustc (>= 1.1),
+               curl | wget,
+               pkg-config,
+               cmake,
+               git,
+               python-dulwich,
+               python-pytoml,
+               ca-certificates,
+               bash-completion,
+               libhttp-parser-dev,
+               libcurl4-openssl-dev,
+               libssh2-1-dev,
+               libgit2-dev,
+               libhttp-parser-dev,
+               libssl-dev,
+               zlib1g-dev
+Homepage: https://crates.io/
+Standards-Version: 3.9.6
+
+Package: cargo
+Architecture: any
+Multi-Arch: foreign
+Depends: ${shlibs:Depends}, ${misc:Depends},
+         rustc (>= 1.1),
+         binutils,
+         gcc | clang | c-compiler
+Description: Rust package manager
+ Cargo is a tool that allows Rust projects to declare their various
+ dependencies, and ensure that you'll always get a repeatable build.
+ .
+ To accomplish this goal, Cargo does four things:
+  * Introduces two metadata files with various bits of project information.
+  * Fetches and builds your project's dependencies.
+  * Invokes rustc or another build tool with the correct parameters to build 
+    your project.
+  * Introduces conventions, making working with Rust projects easier.
+ .
+ Cargo downloads your Rust project’s dependencies and compiles your
+ project.
diff --git a/debian/copyright b/debian/copyright
new file mode 100644 (file)
index 0000000..c3c380e
--- /dev/null
@@ -0,0 +1,449 @@
+Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
+Upstream-Name: cargo
+Source: https://github.com/rust-lang/cargo
+
+Files: *
+Copyright: 2014 The Rust Project Developers
+License: MIT-License or Apache-2.0
+ Licensed under the Apache License, Version 2.0 <LICENSE-APACHE
+ or http://www.apache.org/licenses/LICENSE-2.0> or the MIT
+ license <LICENSE-MIT or http://opensource.org/licenses/MIT>,
+ at your option. All files in the project carrying such
+ notice may not be copied, modified, or distributed except
+ according to those terms.
+
+Files: index/*
+Copyright: 2014 The Rust Project Developers
+License: MIT-License or Apache-2.0
+Comment:
+ This is the backend content of crates.io. While the DB is probably
+ not copyrightable material per se, the original project is dual
+ licensed as MIT/Apache-2.0. See https://github.com/rust-lang/crates.io
+
+Files: deps/*
+       deps/time-*
+       deps/libc-*
+       deps/bitflags-*
+       deps/semver-*
+       deps/log-*
+       deps/glob-*
+       deps/env_logger-*
+       deps/rustc-*
+       deps/term-*
+       deps/threadpool-*
+       deps/rustc-serialize-*
+       deps/regex-*
+Copyright: 2014-2015 The Rust Project Developers
+License: MIT-License or Apache-2.0
+Comment:
+ This is a collection of external crates embedded here to bootstrap cargo.
+ Most of them come from the original upstream Rust project, thus share the
+ same MIT/Apache-2.0 dual-license. See https://github.com/rust-lang.
+ Exceptions are noted below.
+
+Files:
+       deps/advapi32-sys-*
+       deps/kernel32-*
+       deps/winapi-*
+Copyright: 2015 Peter Atashian <retep998@gmail.com>
+License: MIT-License
+Comment: see https://github.com/retep998/winapi-rs
+
+Files: deps/url-*
+Copyright: 2015 Simon Sapin <simon.sapin@exyr.org>
+License: MIT-License or Apache-2.0
+Comment: see https://github.com/servo/rust-url
+
+Files: deps/matches-*
+Copyright: 2015 Simon Sapin <simon.sapin@exyr.org>
+License: MIT-License
+Comment: see https://github.com/SimonSapin/rust-std-candidates
+
+Files: deps/num_cpus-*
+Copyright: 2015 Sean McArthur <sean.monstar@gmail.com>
+License: MIT-License
+Comment: see https://github.com/seanmonstar/num_cpus
+
+Files: deps/strsim-*
+Copyright: 2015 Danny Guo <dannyguo91@gmail.com>
+License: MIT-License
+Comment: see https://github.com/dguo/strsim-rs
+
+Files: deps/memchr-*
+       deps/aho-corasick-*
+       deps/docopt-*
+Copyright: 2015 Andrew Gallant <jamslam@gmail.com>
+License: MIT-License or Unlicense
+Comment: see upstream projects,
+ * https://github.com/BurntSushi/rust-memchr
+ * https://github.com/BurntSushi/aho-corasick
+ * https://github.com/docopt/docopt.rs
+
+Files: deps/openssl-sys-*
+Copyright: 2015 Alex Crichton <alex@alexcrichton.com>
+           2015 Steven Fackler <sfackler@gmail.com>
+License: MIT-License
+Comment: see https://github.com/sfackler/rust-openssl
+
+Files:
+       deps/libz-sys-*
+       deps/libgit2-sys-*
+       deps/libssh2-sys-*
+       deps/miniz-sys-*
+       deps/gcc-*
+       deps/git2-*
+       deps/git2-curl-*
+       deps/filetime-*
+       deps/flate2-*
+       deps/pkg-config-*
+       deps/toml-*
+       deps/tar-*
+Copyright: 2014-2015 Alex Crichton <alex@alexcrichton.com>
+License: MIT-License or Apache-2.0
+Comment: see https://github.com/alexcrichton/
+
+Files: deps/encoding-*
+Copyright: 2015 Kang Seonghoon <public+rust@mearie.org>
+License: MIT-License
+Comment: see https://github.com/lifthrasiir/rust-encoding
+
+Files: deps/encoding-index-*
+       deps/encoding_index_tests-*
+Copyright: 2015 Kang Seonghoon <public+rust@mearie.org>
+License: CC0-1.0
+Comment: see https://github.com/lifthrasiir/rust-encoding
+
+Files: deps/miniz-sys-*/miniz.c
+Copyright: Rich Geldreich <richgel99@gmail.com>
+License: Unlicense
+
+Files: deps/libgit2-sys-*/libgit2/*
+Copyright: 2009-2012, the libgit2 contributors
+License: GPL-2 with linking exception
+
+Files: deps/libgit2-sys-*/libgit2/cmake/Modules/FindGSSAPI.cmake
+Copyright: 2013, Andreas Schneider <asn@cryptomilk.org>
+License: BSD-2-clause
+
+Files: deps/libgit2-sys-*/libgit2/include/git2/inttypes.h
+       deps/libgit2-sys-*/libgit2/include/git2/stdint.h
+Copyright: 2006, Alexander Chemeris
+License: BSD-3-clause
+
+Files: deps/libgit2-sys-*/libgit2/src/khash.h
+Copyright: 2008, 2009, 2011, Attractive Chaos <attractor@live.co.uk>
+License: MIT-License
+
+Files: deps/libgit2-sys-*/libgit2/src/xdiff/*
+Copyright: 2003-2006, Davide Libenzi
+           2003-2006, Johannes E. Schindelin
+License: LGPL-2.1+
+
+Files: deps/libgit2-sys-*/libgit2/src/xdiff/xhistogram.c
+Copyright: 2010, Google Inc and others from JGit's IP log.
+License: EDL-1.0
+
+Files: deps/libgit2-sys-*/libgit2/src/date.c
+Copyright: 2005, Linus Torvalds
+License: GPL-2 with linking exception
+
+Files: src/doc/javascripts/prism.js
+       debian/missing-sources/prism.js
+Copyright: 2015 Lea Verou
+License: MIT-License
+
+Files: debian/bootstrap.py
+Copyright: 2015 David Huseby
+License: BSD-2-clause
+Comment: See LICENSE at https://github.com/dhuseby/cargo-bootstrap/
+
+License: BSD-2-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are met:
+ .
+ 1. Redistributions of source code must retain the above copyright notice,
+    this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright notice,
+    this list of conditions and the following disclaimer in the documentation
+    and/or other materials provided with the distribution.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
+ LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
+ POSSIBILITY OF SUCH DAMAGE.
+
+License: MIT-License
+  Permission is hereby granted, free of charge, to any person obtaining a copy
+  of this software and associated documentation files (the "Software"), to deal
+  in the Software without restriction, including without limitation the rights
+  to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+  copies of the Software, and to permit persons to whom the Software is
+  furnished to do so, subject to the following conditions:
+  .
+  The above copyright notice and this permission notice shall be included in
+  all copies or substantial portions of the Software.
+  .
+  THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+  IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+  FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+  AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+  LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+  OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+  THE SOFTWARE.
+
+License: Apache-2.0
+ On Debian systems, see /usr/share/common-licenses/Apache-2.0 for
+ the full text of the Apache License version 2.
+
+License: Unlicense
+ This is free and unencumbered software released into the public domain.
+ .
+ Anyone is free to copy, modify, publish, use, compile, sell, or
+ distribute this software, either in source code form or as a compiled
+ binary, for any purpose, commercial or non-commercial, and by any
+ means.
+ .
+ In jurisdictions that recognize copyright laws, the author or authors
+ of this software dedicate any and all copyright interest in the
+ software to the public domain. We make this dedication for the benefit
+ of the public at large and to the detriment of our heirs and
+ successors. We intend this dedication to be an overt act of
+ relinquishment in perpetuity of all present and future rights to this
+ software under copyright law.
+ .
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+ EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+ MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+ IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
+ OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
+ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
+ OTHER DEALINGS IN THE SOFTWARE.
+
+License: CC0-1.0
+ Creative Commons CC0 1.0 Universal
+ .
+ CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE LEGAL
+ SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN ATTORNEY-CLIENT
+ RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS INFORMATION ON AN "AS-IS" BASIS.
+ CREATIVE COMMONS MAKES NO WARRANTIES REGARDING THE USE OF THIS DOCUMENT OR THE
+ INFORMATION OR WORKS PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES
+ RESULTING FROM THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
+ HEREUNDER.
+ .
+ Statement of Purpose
+ .
+ The laws of most jurisdictions throughout the world automatically confer
+ exclusive Copyright and Related Rights (defined below) upon the creator and
+ subsequent owner(s) (each and all, an "owner") of an original work of
+ authorship and/or a database (each, a "Work").
+ .
+ Certain owners wish to permanently relinquish those rights to a Work for the
+ purpose of contributing to a commons of creative, cultural and scientific works
+ ("Commons") that the public can reliably and without fear of later claims of
+ infringement build upon, modify, incorporate in other works, reuse and
+ redistribute as freely as possible in any form whatsoever and for any purposes,
+ including without limitation commercial purposes. These owners may contribute
+ to the Commons to promote the ideal of a free culture and the further
+ production of creative, cultural and scientific works, or to gain reputation or
+ greater distribution for their Work in part through the use and efforts of
+ others.
+ .
+ For these and/or other purposes and motivations, and without any expectation of
+ additional consideration or compensation, the person associating CC0 with a
+ Work (the "Affirmer"), to the extent that he or she is an owner of Copyright
+ and Related Rights in the Work, voluntarily elects to apply CC0 to the Work and
+ publicly distribute the Work under its terms, with knowledge of his or her
+ Copyright and Related Rights in the Work and the meaning and intended legal
+ effect of CC0 on those rights.
+ .
+ 1. Copyright and Related Rights. A Work made available under CC0 may be
+ protected by copyright and related or neighboring rights ("Copyright and
+ Related Rights"). Copyright and Related Rights include, but are not limited to,
+ the following:
+ .
+ i. the right to reproduce, adapt, distribute, perform, display, communicate,
+ and translate a Work;
+ .
+ ii. moral rights retained by the original author(s) and/or performer(s);
+ .
+ iii. publicity and privacy rights pertaining to a person's image or likeness
+ depicted in a Work;
+ .
+ iv. rights protecting against unfair competition in regards to a Work, subject
+ to the limitations in paragraph 4(a), below;
+ .
+ v. rights protecting the extraction, dissemination, use and reuse of data in a
+ Work;
+ .
+ vi. database rights (such as those arising under Directive 96/9/EC of the
+ European Parliament and of the Council of 11 March 1996 on the legal protection
+ of databases, and under any national implementation thereof, including any
+ amended or successor version of such directive); and
+ .
+ vii. other similar, equivalent or corresponding rights throughout the world
+ based on applicable law or treaty, and any national implementations thereof.
+ .
+ 2. Waiver. To the greatest extent permitted by, but not in contravention of,
+ applicable law, Affirmer hereby overtly, fully, permanently, irrevocably and
+ unconditionally waives, abandons, and surrenders all of Affirmer's Copyright
+ and Related Rights and associated claims and causes of action, whether now
+ known or unknown (including existing as well as future claims and causes of
+ action), in the Work (i) in all territories worldwide, (ii) for the maximum
+ duration provided by applicable law or treaty (including future time
+ extensions), (iii) in any current or future medium and for any number of
+ copies, and (iv) for any purpose whatsoever, including without limitation
+ commercial, advertising or promotional purposes (the "Waiver"). Affirmer makes
+ the Waiver for the benefit of each member of the public at large and to the
+ detriment of Affirmer's heirs and successors, fully intending that such Waiver
+ shall not be subject to revocation, rescission, cancellation, termination, or
+ any other legal or equitable action to disrupt the quiet enjoyment of the Work
+ by the public as contemplated by Affirmer's express Statement of Purpose.
+ .
+ 3. Public License Fallback. Should any part of the Waiver for any reason be
+ judged legally invalid or ineffective under applicable law, then the Waiver
+ shall be preserved to the maximum extent permitted taking into account
+ Affirmer's express Statement of Purpose. In addition, to the extent the Waiver
+ is so judged Affirmer hereby grants to each affected person a royalty-free, non
+ transferable, non sublicensable, non exclusive, irrevocable and unconditional
+ license to exercise Affirmer's Copyright and Related Rights in the Work (i) in
+ all territories worldwide, (ii) for the maximum duration provided by applicable
+ law or treaty (including future time extensions), (iii) in any current or
+ future medium and for any number of copies, and (iv) for any purpose
+ whatsoever, including without limitation commercial, advertising or promotional
+ purposes (the "License"). The License shall be deemed effective as of the date
+ CC0 was applied by Affirmer to the Work. Should any part of the License for any
+ reason be judged legally invalid or ineffective under applicable law, such
+ partial invalidity or ineffectiveness shall not invalidate the remainder of the
+ License, and in such case Affirmer hereby affirms that he or she will not (i)
+ exercise any of his or her remaining Copyright and Related Rights in the Work
+ or (ii) assert any associated claims and causes of action with respect to the
+ Work, in either case contrary to Affirmer's express Statement of Purpose.
+ .
+ 4. Limitations and Disclaimers.
+ .
+ a. No trademark or patent rights held by Affirmer are waived, abandoned,
+ surrendered, licensed or otherwise affected by this document.
+ .
+ b. Affirmer offers the Work as-is and makes no representations or warranties of
+ any kind concerning the Work, express, implied, statutory or otherwise,
+ including without limitation warranties of title, merchantability, fitness for
+ a particular purpose, non infringement, or the absence of latent or other
+ defects, accuracy, or the present or absence of errors, whether or not
+ discoverable, all to the greatest extent permissible under applicable law.
+ .
+ c. Affirmer disclaims responsibility for clearing rights of other persons that
+ may apply to the Work or any use thereof, including without limitation any
+ person's Copyright and Related Rights in the Work. Further, Affirmer disclaims
+ responsibility for obtaining any necessary consents, permissions or other
+ rights required for any use of the Work.
+ .
+ d. Affirmer understands and acknowledges that Creative Commons is not a party
+ to this document and has no duty or obligation with respect to this CC0 or use
+ of the Work.
+
+License: EDL-1.0
+ This program and the accompanying materials are made available
+ under the terms of the Eclipse Distribution License v1.0 which
+ accompanies this distribution, is reproduced below, and is
+ available at http://www.eclipse.org/org/documents/edl-v10.php
+ .
+ All rights reserved.
+ .
+ Redistribution and use in source and binary forms, with or
+ without modification, are permitted provided that the following
+ conditions are met:
+ .
+ - Redistributions of source code must retain the above copyright
+   notice, this list of conditions and the following disclaimer.
+ .
+ - Redistributions in binary form must reproduce the above
+   copyright notice, this list of conditions and the following
+   disclaimer in the documentation and/or other materials provided
+   with the distribution.
+ .
+ - Neither the name of the Eclipse Foundation, Inc. nor the
+   names of its contributors may be used to endorse or promote
+   products derived from this software without specific prior
+   written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND
+ CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES,
+ INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
+ OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
+ CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
+ NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+ LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
+ STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF
+ ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+License: LGPL-2.1+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+ .
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ Lesser General Public License for more details.
+ .
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
+ 02110-1301 USA.
+
+License: BSD-3-clause
+ All rights reserved.
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions
+ are met:
+ 1. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the distribution.
+ 3. Neither the name of the University nor the names of its contributors
+ may be used to endorse or promote products derived from this software
+ without specific prior written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
+ ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
+ OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+ LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
+ OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
+ SUCH DAMAGE.
+
+License: GPL-2 with linking exception
+ Note that the only valid version of the GPL as far as this project
+ is concerned is _this_ particular version of the license (ie v2, not
+ v2.2 or v3.x or whatever), unless explicitly otherwise stated.
+ .
+            LINKING EXCEPTION
+ .
+ In addition to the permissions in the GNU General Public License,
+ the authors give you unlimited permission to link the compiled
+ version of this library into combinations with other programs,
+ and to distribute those combinations without any restriction
+ coming from the use of this file.  (The General Public License
+ restrictions do apply in other respects; for example, they cover
+ modification of the file, and distribution when not linked into
+ a combined executable.)
+ .
+ On Debian systems, the complete text of the GNU General
+ Public License version 2 can be found in "/usr/share/common-licenses/GPL-2".
diff --git a/debian/crates.io-index b/debian/crates.io-index
new file mode 100644 (file)
index 0000000..1c34b9d
--- /dev/null
@@ -0,0 +1,5 @@
+# rust-lang/crates.io-index was snapshotted at this commit:
+COMMIT_SHA='ad9a3abfecf6340aed54a0ede0b63349e830d79b'
+
+# Download URL is:
+DOWNLOAD_URL="https://github.com/rust-lang/crates.io-index/archive/${COMMIT_SHA}.tar.gz"
diff --git a/debian/deps-tarball-filter.txt b/debian/deps-tarball-filter.txt
new file mode 100644 (file)
index 0000000..61fbcb6
--- /dev/null
@@ -0,0 +1,8 @@
+# This is a list of files and dirs that should be filtered out of
+# the deps tarball for copyright/duplication reasons
+curl-sys-*/curl/
+libgit2-sys-*/libgit2/deps/
+libgit2-sys-*/libgit2/examples/
+libgit2-sys-*/libgit2/tests/
+libssh2-sys-*/libssh2-*/
+libz-sys-*/src/zlib-*/
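These entries are shell glob patterns: make_orig_multi.sh strips the comment lines and hands each remaining pattern to a shell for expansion and removal. A minimal sketch of that mechanism, using illustrative crate directory names (not the real tarball contents):

```shell
#!/bin/sh
set -e
# Recreate a tiny stand-in for the unpacked deps tree (names are hypothetical).
mkdir -p deps/curl-sys-0.1/curl deps/libgit2-sys-0.1/libgit2/tests
# A filter file in the same format as debian/deps-tarball-filter.txt.
printf '# comment line\ncurl-sys-*/curl/\nlibgit2-sys-*/libgit2/tests/\n' > filter.txt
# Same pipeline as in debian/make_orig_multi.sh: drop comments, expand
# each glob inside deps/, and remove whatever it matches.
cd deps
grep -v '^#' ../filter.txt | xargs -I% sh -c "rm -rf %"
cd ..
```

After this runs, only the globbed subdirectories are gone; the crate directories themselves survive, which is what keeps the rest of each vendored crate in the orig-deps tarball.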
diff --git a/debian/docs b/debian/docs
new file mode 100644 (file)
index 0000000..311b170
--- /dev/null
@@ -0,0 +1 @@
+target/doc/*
diff --git a/debian/gbp.conf b/debian/gbp.conf
new file mode 100644 (file)
index 0000000..45c5c5b
--- /dev/null
@@ -0,0 +1,2 @@
+[buildpackage]
+submodules = True
diff --git a/debian/install b/debian/install
new file mode 100644 (file)
index 0000000..e2a3d3f
--- /dev/null
@@ -0,0 +1 @@
+target/x86_64-unknown-linux-gnu/release/cargo usr/bin
diff --git a/debian/make_orig_multi.sh b/debian/make_orig_multi.sh
new file mode 100755 (executable)
index 0000000..9ba211b
--- /dev/null
@@ -0,0 +1,54 @@
+#!/bin/sh
+set -e
+echo "This needs python-dulwich and python-pytoml installed"
+
+TMPDIR=`mktemp -d`
+echo "Using '${TMPDIR}'..."
+cat > "${TMPDIR}/Makefile" <<'EOF'
+include /usr/share/dpkg/pkg-info.mk
+all:
+       @echo $(DEB_VERSION_UPSTREAM)
+EOF
+CARGO_VER=$(make -f "${TMPDIR}/Makefile")
+BOOTSTRAP_PY=$(find "${PWD}" -name bootstrap.py -type f)
+DEPS_FILTER=$(find "${PWD}" -name deps-tarball-filter.txt -type f)
+
+# Download cargo tarballs
+uscan --rename --force-download --destdir ${TMPDIR}
+
+# Download the crates.io-index snapshot matching this cargo version
+. debian/crates.io-index
+echo "${DOWNLOAD_URL}"
+wget -O "${TMPDIR}/cargo_${CARGO_VER}.orig-index.tar.gz" "${DOWNLOAD_URL}"
+
+# Extract cargo source
+cd "${TMPDIR}"
+mkdir cargo
+tar -xaf "${TMPDIR}/cargo_${CARGO_VER}.orig.tar.gz" -C cargo --strip-components=1
+cd cargo
+
+# Extract crates.io-index snapshot
+mkdir index
+tar -xaf "${TMPDIR}/cargo_${CARGO_VER}.orig-index.tar.gz" -C index --strip-components=1
+
+# Download build-dep packages from crates.io
+# (target spec is dummy/unused here)
+mkdir deps
+${BOOTSTRAP_PY} --download \
+                --no-clean \
+                --no-clone \
+                --crate-index "${TMPDIR}/cargo/index/" \
+                --cargo-root "${TMPDIR}/cargo" \
+                --target-dir "${TMPDIR}/cargo/deps/" \
+                --target x86_64-unknown-linux-gnu
+cd deps && grep -v '^#' "${DEPS_FILTER}" | xargs -I% sh -c "rm -rf %" && cd ..
+tar -czf "${TMPDIR}/cargo_${CARGO_VER}.orig-deps.tar.gz" deps
+
+# All is good, we are done!
+echo "Your files are available at:"
+echo "${TMPDIR}/cargo_${CARGO_VER}.orig.tar.gz \\"
+echo "${TMPDIR}/cargo_${CARGO_VER}.orig-index.tar.gz \\"
+echo "${TMPDIR}/cargo_${CARGO_VER}.orig-deps.tar.gz"
+echo ""
+echo "Unpacked cargo sources are available under:"
+echo "${TMPDIR}/cargo/"
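The temporary Makefile near the top of the script exists only to ask dpkg's pkg-info.mk for DEB_VERSION_UPSTREAM, i.e. the version from the first debian/changelog entry with the Debian revision stripped. Outside a Debian source tree the same extraction can be sketched in plain shell; the changelog line below is a hypothetical example, not taken from this package:

```shell
#!/bin/sh
set -e
# Hypothetical first changelog line, as dpkg-parsechangelog would read it.
line='cargo (0.3.0-1) unstable; urgency=medium'
# The full Debian version is the text between the parentheses.
deb_version=$(printf '%s\n' "$line" | sed 's/^[^(]*(\([^)]*\)).*/\1/')
# DEB_VERSION_UPSTREAM drops the Debian revision after the last '-'.
upstream=${deb_version%-*}
echo "$upstream"
```

This is why the script needs no hard-coded cargo version: bumping debian/changelog is enough for uscan and the tarball names to track the new upstream release.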
diff --git a/debian/missing-sources/prism.js b/debian/missing-sources/prism.js
new file mode 100644 (file)
index 0000000..b3ff71f
--- /dev/null
@@ -0,0 +1,599 @@
+/* http://prismjs.com/download.html?themes=prism&languages=markup+css+clike+javascript */
+var _self = (typeof window !== 'undefined')
+       ? window   // if in browser
+       : (
+               (typeof WorkerGlobalScope !== 'undefined' && self instanceof WorkerGlobalScope)
+               ? self // if in worker
+               : {}   // if in node js
+       );
+
+/**
+ * Prism: Lightweight, robust, elegant syntax highlighting
+ * MIT license http://www.opensource.org/licenses/mit-license.php/
+ * @author Lea Verou http://lea.verou.me
+ */
+
+var Prism = (function(){
+
+// Private helper vars
+var lang = /\blang(?:uage)?-(?!\*)(\w+)\b/i;
+
+var _ = _self.Prism = {
+       util: {
+               encode: function (tokens) {
+                       if (tokens instanceof Token) {
+                               return new Token(tokens.type, _.util.encode(tokens.content), tokens.alias);
+                       } else if (_.util.type(tokens) === 'Array') {
+                               return tokens.map(_.util.encode);
+                       } else {
+                               return tokens.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/\u00a0/g, ' ');
+                       }
+               },
+
+               type: function (o) {
+                       return Object.prototype.toString.call(o).match(/\[object (\w+)\]/)[1];
+               },
+
+               // Deep clone a language definition (e.g. to extend it)
+               clone: function (o) {
+                       var type = _.util.type(o);
+
+                       switch (type) {
+                               case 'Object':
+                                       var clone = {};
+
+                                       for (var key in o) {
+                                               if (o.hasOwnProperty(key)) {
+                                                       clone[key] = _.util.clone(o[key]);
+                                               }
+                                       }
+
+                                       return clone;
+
+                               case 'Array':
+                                       // Check for existence for IE8
+                                       return o.map && o.map(function(v) { return _.util.clone(v); });
+                       }
+
+                       return o;
+               }
+       },
+
+       languages: {
+               extend: function (id, redef) {
+                       var lang = _.util.clone(_.languages[id]);
+
+                       for (var key in redef) {
+                               lang[key] = redef[key];
+                       }
+
+                       return lang;
+               },
+
+               /**
+                * Insert a token before another token in a language literal
+                * As this needs to recreate the object (we cannot actually insert before keys in object literals),
+ * we cannot just provide an object, we need an object and a key.
+                * @param inside The key (or language id) of the parent
+                * @param before The key to insert before. If not provided, the function appends instead.
+                * @param insert Object with the key/value pairs to insert
+                * @param root The object that contains `inside`. If equal to Prism.languages, it can be omitted.
+                */
+               insertBefore: function (inside, before, insert, root) {
+                       root = root || _.languages;
+                       var grammar = root[inside];
+                       
+                       if (arguments.length == 2) {
+                               insert = arguments[1];
+                               
+                               for (var newToken in insert) {
+                                       if (insert.hasOwnProperty(newToken)) {
+                                               grammar[newToken] = insert[newToken];
+                                       }
+                               }
+                               
+                               return grammar;
+                       }
+                       
+                       var ret = {};
+
+                       for (var token in grammar) {
+
+                               if (grammar.hasOwnProperty(token)) {
+
+                                       if (token == before) {
+
+                                               for (var newToken in insert) {
+
+                                                       if (insert.hasOwnProperty(newToken)) {
+                                                               ret[newToken] = insert[newToken];
+                                                       }
+                                               }
+                                       }
+
+                                       ret[token] = grammar[token];
+                               }
+                       }
+                       
+                       // Update references in other language definitions
+                       _.languages.DFS(_.languages, function(key, value) {
+                               if (value === root[inside] && key != inside) {
+                                       this[key] = ret;
+                               }
+                       });
+
+                       return root[inside] = ret;
+               },
+
+               // Traverse a language definition with Depth First Search
+               DFS: function(o, callback, type) {
+                       for (var i in o) {
+                               if (o.hasOwnProperty(i)) {
+                                       callback.call(o, i, o[i], type || i);
+
+                                       if (_.util.type(o[i]) === 'Object') {
+                                               _.languages.DFS(o[i], callback);
+                                       }
+                                       else if (_.util.type(o[i]) === 'Array') {
+                                               _.languages.DFS(o[i], callback, i);
+                                       }
+                               }
+                       }
+               }
+       },
+
+       highlightAll: function(async, callback) {
+               var elements = document.querySelectorAll('code[class*="language-"], [class*="language-"] code, code[class*="lang-"], [class*="lang-"] code');
+
+               for (var i=0, element; element = elements[i++];) {
+                       _.highlightElement(element, async === true, callback);
+               }
+       },
+
+       highlightElement: function(element, async, callback) {
+               // Find language
+               var language, grammar, parent = element;
+
+               while (parent && !lang.test(parent.className)) {
+                       parent = parent.parentNode;
+               }
+
+               if (parent) {
+                       language = (parent.className.match(lang) || [,''])[1];
+                       grammar = _.languages[language];
+               }
+
+               // Set language on the element, if not present
+               element.className = element.className.replace(lang, '').replace(/\s+/g, ' ') + ' language-' + language;
+
+               // Set language on the parent, for styling
+               parent = element.parentNode;
+
+               if (/pre/i.test(parent.nodeName)) {
+                       parent.className = parent.className.replace(lang, '').replace(/\s+/g, ' ') + ' language-' + language;
+               }
+
+               if (!grammar) {
+                       return;
+               }
+
+               var code = element.textContent;
+
+               if(!code) {
+                       return;
+               }
+
+               code = code.replace(/^(?:\r?\n|\r)/,'');
+
+               var env = {
+                       element: element,
+                       language: language,
+                       grammar: grammar,
+                       code: code
+               };
+
+               _.hooks.run('before-highlight', env);
+
+               if (async && _self.Worker) {
+                       var worker = new Worker(_.filename);
+
+                       worker.onmessage = function(evt) {
+                               env.highlightedCode = Token.stringify(JSON.parse(evt.data), language);
+
+                               _.hooks.run('before-insert', env);
+
+                               env.element.innerHTML = env.highlightedCode;
+
+                               callback && callback.call(env.element);
+                               _.hooks.run('after-highlight', env);
+                       };
+
+                       worker.postMessage(JSON.stringify({
+                               language: env.language,
+                               code: env.code
+                       }));
+               }
+               else {
+                       env.highlightedCode = _.highlight(env.code, env.grammar, env.language);
+
+                       _.hooks.run('before-insert', env);
+
+                       env.element.innerHTML = env.highlightedCode;
+
+                       callback && callback.call(element);
+
+                       _.hooks.run('after-highlight', env);
+               }
+       },
+
+       highlight: function (text, grammar, language) {
+               var tokens = _.tokenize(text, grammar);
+               return Token.stringify(_.util.encode(tokens), language);
+       },
+
+       tokenize: function(text, grammar, language) {
+               var Token = _.Token;
+
+               var strarr = [text];
+
+               var rest = grammar.rest;
+
+               if (rest) {
+                       for (var token in rest) {
+                               grammar[token] = rest[token];
+                       }
+
+                       delete grammar.rest;
+               }
+
+               tokenloop: for (var token in grammar) {
+                       if(!grammar.hasOwnProperty(token) || !grammar[token]) {
+                               continue;
+                       }
+
+                       var patterns = grammar[token];
+                       patterns = (_.util.type(patterns) === "Array") ? patterns : [patterns];
+
+                       for (var j = 0; j < patterns.length; ++j) {
+                               var pattern = patterns[j],
+                                       inside = pattern.inside,
+                                       lookbehind = !!pattern.lookbehind,
+                                       lookbehindLength = 0,
+                                       alias = pattern.alias;
+
+                               pattern = pattern.pattern || pattern;
+
+                               for (var i=0; i<strarr.length; i++) { // Don’t cache length as it changes during the loop
+
+                                       var str = strarr[i];
+
+                                       if (strarr.length > text.length) {
+                                               // Something went terribly wrong, ABORT, ABORT!
+                                               break tokenloop;
+                                       }
+
+                                       if (str instanceof Token) {
+                                               continue;
+                                       }
+
+                                       pattern.lastIndex = 0;
+
+                                       var match = pattern.exec(str);
+
+                                       if (match) {
+                                               if(lookbehind) {
+                                                       lookbehindLength = match[1].length;
+                                               }
+
+                                               var from = match.index - 1 + lookbehindLength,
+                                                       match = match[0].slice(lookbehindLength),
+                                                       len = match.length,
+                                                       to = from + len,
+                                                       before = str.slice(0, from + 1),
+                                                       after = str.slice(to + 1);
+
+                                               var args = [i, 1];
+
+                                               if (before) {
+                                                       args.push(before);
+                                               }
+
+                                               var wrapped = new Token(token, inside? _.tokenize(match, inside) : match, alias);
+
+                                               args.push(wrapped);
+
+                                               if (after) {
+                                                       args.push(after);
+                                               }
+
+                                               Array.prototype.splice.apply(strarr, args);
+                                       }
+                               }
+                       }
+               }
+
+               return strarr;
+       },
+
+       hooks: {
+               all: {},
+
+               add: function (name, callback) {
+                       var hooks = _.hooks.all;
+
+                       hooks[name] = hooks[name] || [];
+
+                       hooks[name].push(callback);
+               },
+
+               run: function (name, env) {
+                       var callbacks = _.hooks.all[name];
+
+                       if (!callbacks || !callbacks.length) {
+                               return;
+                       }
+
+                       for (var i=0, callback; callback = callbacks[i++];) {
+                               callback(env);
+                       }
+               }
+       }
+};
+
+var Token = _.Token = function(type, content, alias) {
+       this.type = type;
+       this.content = content;
+       this.alias = alias;
+};
+
+Token.stringify = function(o, language, parent) {
+       if (typeof o == 'string') {
+               return o;
+       }
+
+       if (_.util.type(o) === 'Array') {
+               return o.map(function(element) {
+                       return Token.stringify(element, language, o);
+               }).join('');
+       }
+
+       var env = {
+               type: o.type,
+               content: Token.stringify(o.content, language, parent),
+               tag: 'span',
+               classes: ['token', o.type],
+               attributes: {},
+               language: language,
+               parent: parent
+       };
+
+       if (env.type == 'comment') {
+               env.attributes['spellcheck'] = 'true';
+       }
+
+       if (o.alias) {
+               var aliases = _.util.type(o.alias) === 'Array' ? o.alias : [o.alias];
+               Array.prototype.push.apply(env.classes, aliases);
+       }
+
+       _.hooks.run('wrap', env);
+
+       var attributes = '';
+
+       for (var name in env.attributes) {
+               attributes += name + '="' + (env.attributes[name] || '') + '"';
+       }
+
+       return '<' + env.tag + ' class="' + env.classes.join(' ') + '" ' + attributes + '>' + env.content + '</' + env.tag + '>';
+
+};
+
+if (!_self.document) {
+       if (!_self.addEventListener) {
+               // in Node.js
+               return _self.Prism;
+       }
+       // In worker
+       _self.addEventListener('message', function(evt) {
+               var message = JSON.parse(evt.data),
+                   lang = message.language,
+                   code = message.code;
+
+               _self.postMessage(JSON.stringify(_.util.encode(_.tokenize(code, _.languages[lang]))));
+               _self.close();
+       }, false);
+
+       return _self.Prism;
+}
+
+// Get current script and highlight
+var script = document.getElementsByTagName('script');
+
+script = script[script.length - 1];
+
+if (script) {
+       _.filename = script.src;
+
+       if (document.addEventListener && !script.hasAttribute('data-manual')) {
+               document.addEventListener('DOMContentLoaded', _.highlightAll);
+       }
+}
+
+return _self.Prism;
+
+})();
+
+if (typeof module !== 'undefined' && module.exports) {
+       module.exports = Prism;
+}
+;
+Prism.languages.markup = {
+       'comment': /<!--[\w\W]*?-->/,
+       'prolog': /<\?[\w\W]+?\?>/,
+       'doctype': /<!DOCTYPE[\w\W]+?>/,
+       'cdata': /<!\[CDATA\[[\w\W]*?]]>/i,
+       'tag': {
+               pattern: /<\/?[^\s>\/]+(?:\s+[^\s>\/=]+(?:=(?:("|')(?:\\\1|\\?(?!\1)[\w\W])*\1|[^\s'">=]+))?)*\s*\/?>/i,
+               inside: {
+                       'tag': {
+                               pattern: /^<\/?[^\s>\/]+/i,
+                               inside: {
+                                       'punctuation': /^<\/?/,
+                                       'namespace': /^[^\s>\/:]+:/
+                               }
+                       },
+                       'attr-value': {
+                               pattern: /=(?:('|")[\w\W]*?(\1)|[^\s>]+)/i,
+                               inside: {
+                                       'punctuation': /[=>"']/
+                               }
+                       },
+                       'punctuation': /\/?>/,
+                       'attr-name': {
+                               pattern: /[^\s>\/]+/,
+                               inside: {
+                                       'namespace': /^[^\s>\/:]+:/
+                               }
+                       }
+
+               }
+       },
+       'entity': /&#?[\da-z]{1,8};/i
+};
+
+// Plugin to make entity title show the real entity, idea by Roman Komarov
+Prism.hooks.add('wrap', function(env) {
+
+       if (env.type === 'entity') {
+               env.attributes['title'] = env.content.replace(/&amp;/, '&');
+       }
+});
+;
+Prism.languages.css = {
+       'comment': /\/\*[\w\W]*?\*\//,
+       'atrule': {
+               pattern: /@[\w-]+?.*?(;|(?=\s*\{))/i,
+               inside: {
+                       'rule': /@[\w-]+/
+                       // See rest below
+               }
+       },
+       'url': /url\((?:(["'])(\\(?:\r\n|[\w\W])|(?!\1)[^\\\r\n])*\1|.*?)\)/i,
+       'selector': /[^\{\}\s][^\{\};]*?(?=\s*\{)/,
+       'string': /("|')(\\(?:\r\n|[\w\W])|(?!\1)[^\\\r\n])*\1/,
+       'property': /(\b|\B)[\w-]+(?=\s*:)/i,
+       'important': /\B!important\b/i,
+       'function': /[-a-z0-9]+(?=\()/i,
+       'punctuation': /[(){};:]/
+};
+
+Prism.languages.css['atrule'].inside.rest = Prism.util.clone(Prism.languages.css);
+
+if (Prism.languages.markup) {
+       Prism.languages.insertBefore('markup', 'tag', {
+               'style': {
+                       pattern: /<style[\w\W]*?>[\w\W]*?<\/style>/i,
+                       inside: {
+                               'tag': {
+                                       pattern: /<style[\w\W]*?>|<\/style>/i,
+                                       inside: Prism.languages.markup.tag.inside
+                               },
+                               rest: Prism.languages.css
+                       },
+                       alias: 'language-css'
+               }
+       });
+       
+       Prism.languages.insertBefore('inside', 'attr-value', {
+               'style-attr': {
+                       pattern: /\s*style=("|').*?\1/i,
+                       inside: {
+                               'attr-name': {
+                                       pattern: /^\s*style/i,
+                                       inside: Prism.languages.markup.tag.inside
+                               },
+                               'punctuation': /^\s*=\s*['"]|['"]\s*$/,
+                               'attr-value': {
+                                       pattern: /.+/i,
+                                       inside: Prism.languages.css
+                               }
+                       },
+                       alias: 'language-css'
+               }
+       }, Prism.languages.markup.tag);
+};
+Prism.languages.clike = {
+       'comment': [
+               {
+                       pattern: /(^|[^\\])\/\*[\w\W]*?\*\//,
+                       lookbehind: true
+               },
+               {
+                       pattern: /(^|[^\\:])\/\/.*/,
+                       lookbehind: true
+               }
+       ],
+       'string': /("|')(\\(?:\r\n|[\s\S])|(?!\1)[^\\\r\n])*\1/,
+       'class-name': {
+               pattern: /((?:(?:class|interface|extends|implements|trait|instanceof|new)\s+)|(?:catch\s+\())[a-z0-9_\.\\]+/i,
+               lookbehind: true,
+               inside: {
+                       punctuation: /(\.|\\)/
+               }
+       },
+       'keyword': /\b(if|else|while|do|for|return|in|instanceof|function|new|try|throw|catch|finally|null|break|continue)\b/,
+       'boolean': /\b(true|false)\b/,
+       'function': /[a-z0-9_]+(?=\()/i,
+       'number': /\b-?(0x[\dA-Fa-f]+|\d*\.?\d+([Ee]-?\d+)?)\b/,
+       'operator': /[-+]{1,2}|!|<=?|>=?|={1,3}|&{1,2}|\|?\||\?|\*|\/|~|\^|%/,
+       'punctuation': /[{}[\];(),.:]/
+};
+;
+Prism.languages.javascript = Prism.languages.extend('clike', {
+       'keyword': /\b(as|async|await|break|case|catch|class|const|continue|debugger|default|delete|do|else|enum|export|extends|false|finally|for|from|function|get|if|implements|import|in|instanceof|interface|let|new|null|of|package|private|protected|public|return|set|static|super|switch|this|throw|true|try|typeof|var|void|while|with|yield)\b/,
+       'number': /\b-?(0x[\dA-Fa-f]+|0b[01]+|0o[0-7]+|\d*\.?\d+([Ee][+-]?\d+)?|NaN|Infinity)\b/,
+       'function': /(?!\d)[a-z0-9_$]+(?=\()/i
+});
+
+Prism.languages.insertBefore('javascript', 'keyword', {
+       'regex': {
+               pattern: /(^|[^/])\/(?!\/)(\[.+?]|\\.|[^/\\\r\n])+\/[gimyu]{0,5}(?=\s*($|[\r\n,.;})]))/,
+               lookbehind: true
+       }
+});
+
+Prism.languages.insertBefore('javascript', 'class-name', {
+       'template-string': {
+               pattern: /`(?:\\`|\\?[^`])*`/,
+               inside: {
+                       'interpolation': {
+                               pattern: /\$\{[^}]+\}/,
+                               inside: {
+                                       'interpolation-punctuation': {
+                                               pattern: /^\$\{|\}$/,
+                                               alias: 'punctuation'
+                                       },
+                                       rest: Prism.languages.javascript
+                               }
+                       },
+                       'string': /[\s\S]+/
+               }
+       }
+});
+
+if (Prism.languages.markup) {
+       Prism.languages.insertBefore('markup', 'tag', {
+               'script': {
+                       pattern: /<script[\w\W]*?>[\w\W]*?<\/script>/i,
+                       inside: {
+                               'tag': {
+                                       pattern: /<script[\w\W]*?>|<\/script>/i,
+                                       inside: Prism.languages.markup.tag.inside
+                               },
+                               rest: Prism.languages.javascript
+                       },
+                       alias: 'language-javascript'
+               }
+       });
+}
+;
diff --git a/debian/patches/add-paths-override.patch b/debian/patches/add-paths-override.patch
new file mode 100644 (file)
index 0000000..ba97b5d
--- /dev/null
@@ -0,0 +1,56 @@
+From: Luca Bruno <lucab@debian.org>
+Description: Add overrides for stage1 build dependencies
+Forwarded: not-needed
+--- /dev/null
++++ b/.cargo/config
+@@ -0,0 +1,50 @@
++paths = [
++"./deps/advapi32-sys-0.1.1",
++"./deps/aho-corasick-0.2.1",
++"./deps/bitflags-0.1.1",
++"./deps/curl-0.2.10",
++"./deps/curl-sys-0.1.24",
++"./deps/docopt-0.6.67",
++"./deps/encoding-0.2.32",
++"./deps/encoding-index-japanese-1.20141219.5",
++"./deps/encoding-index-korean-1.20141219.5",
++"./deps/encoding-index-simpchinese-1.20141219.5",
++"./deps/encoding-index-singlebyte-1.20141219.5",
++"./deps/encoding-index-tradchinese-1.20141219.5",
++"./deps/encoding_index_tests-0.1.4",
++"./deps/env_logger-0.3.1",
++"./deps/filetime-0.1.4",
++"./deps/flate2-0.2.7",
++"./deps/gcc-0.3.8",
++"./deps/git2-0.2.12",
++"./deps/git2-curl-0.2.4",
++"./deps/glob-0.2.10",
++"./deps/kernel32-sys-0.1.2",
++"./deps/libc-0.1.8",
++"./deps/libgit2-sys-0.2.17",
++"./deps/libssh2-sys-0.1.25",
++"./deps/libz-sys-0.1.6",
++"./deps/log-0.3.1",
++"./deps/matches-0.1.2",
++"./deps/memchr-0.1.3",
++"./deps/miniz-sys-0.1.5",
++"./deps/num_cpus-0.2.6",
++"./deps/openssl-sys-0.6.3",
++"./deps/pkg-config-0.3.5",
++"./deps/regex-0.1.38",
++"./deps/regex-syntax-0.1.2",
++"./deps/rustc-serialize-0.3.15",
++"./deps/semver-0.1.19",
++"./deps/strsim-0.3.0",
++"./deps/tar-0.2.14",
++"./deps/term-0.2.9",
++"./deps/threadpool-0.1.4",
++"./deps/time-0.1.26",
++"./deps/toml-0.1.21",
++"./deps/url-0.2.35",
++"./deps/winapi-0.1.23",
++"./deps/winapi-build-0.1.0",
++]
++
++[registry]
++index = "file://--TOPDIR--/index/"
diff --git a/debian/patches/remove-cargo-devdeps.patch b/debian/patches/remove-cargo-devdeps.patch
new file mode 100644 (file)
index 0000000..632d1bc
--- /dev/null
@@ -0,0 +1,25 @@
+From: Luca Bruno <lucab@debian.org>
+Description: Remove stage1 extra dependencies
+ Fetching extra dev-dependencies is not yet supported by bootstrap.py
+ and they are not needed to build. However, cargo will try to download
+ them before building stage1.
+Forwarded: not-needed
+--- a/Cargo.toml
++++ b/Cargo.toml
+@@ -36,11 +36,11 @@
+ url = "0.2"
+ winapi = "0.1"
+-[dev-dependencies]
+-tempdir = "0.3"
+-hamcrest = { git = "https://github.com/carllerche/hamcrest-rust.git" }
+-bufstream = "0.1"
+-filetime = "0.1"
++#[dev-dependencies]
++#tempdir = "0.3"
++#hamcrest = { git = "https://github.com/carllerche/hamcrest-rust.git" }
++#bufstream = "0.1"
++#filetime = "0.1"
+ [[bin]]
+ name = "cargo"
diff --git a/debian/patches/remove-deps-path.patch b/debian/patches/remove-deps-path.patch
new file mode 100644 (file)
index 0000000..89af340
--- /dev/null
@@ -0,0 +1,193 @@
+From: Luca Bruno <lucab@debian.org>
+Description: Fix relative paths in build dependencies
+Bug: https://github.com/rust-lang/cargo/issues/1863
+Forwarded: not-needed
+--- a/deps/advapi32-sys-0.1.1/Cargo.toml
++++ b/deps/advapi32-sys-0.1.1/Cargo.toml
+@@ -15,6 +15,6 @@
+ name = "advapi32"
+ [dependencies]
+-winapi = { version = "*", path = "../.." }
++winapi = { version = "*" }
+ [build-dependencies]
+-winapi-build = { version = "*", path = "../../build" }
++winapi-build = { version = "*" }
+--- a/deps/curl-0.2.10/Cargo.toml
++++ b/deps/curl-0.2.10/Cargo.toml
+@@ -11,7 +11,7 @@
+ url = "0.2.0"
+ log = "0.3.0"
+ libc = "0.1"
+-curl-sys = { path = "curl-sys", version = "0.1.0" }
++curl-sys = { version = "0.1.0" }
+ [dev-dependencies]
+ env_logger = "0.3.0"
+--- a/deps/encoding-0.2.32/Cargo.toml
++++ b/deps/encoding-0.2.32/Cargo.toml
+@@ -24,23 +24,18 @@
+ [dependencies.encoding-index-singlebyte]
+ version = "~1.20141219.5"
+-path = "src/index/singlebyte"
+ [dependencies.encoding-index-korean]
+ version = "~1.20141219.5"
+-path = "src/index/korean"
+ [dependencies.encoding-index-japanese]
+ version = "~1.20141219.5"
+-path = "src/index/japanese"
+ [dependencies.encoding-index-simpchinese]
+ version = "~1.20141219.5"
+-path = "src/index/simpchinese"
+ [dependencies.encoding-index-tradchinese]
+ version = "~1.20141219.5"
+-path = "src/index/tradchinese"
+ [dev-dependencies]
+ getopts = "*" # for examples
+--- a/deps/encoding-index-japanese-1.20141219.5/Cargo.toml
++++ b/deps/encoding-index-japanese-1.20141219.5/Cargo.toml
+@@ -15,4 +15,3 @@
+ [dependencies.encoding_index_tests]
+ # TODO consider using dev-dependencies instead (Cargo issue #860)
+ version = "0.1.4"
+-path = "../tests"
+--- a/deps/encoding-index-korean-1.20141219.5/Cargo.toml
++++ b/deps/encoding-index-korean-1.20141219.5/Cargo.toml
+@@ -15,4 +15,3 @@
+ [dependencies.encoding_index_tests]
+ # TODO consider using dev-dependencies instead (Cargo issue #860)
+ version = "0.1.4"
+-path = "../tests"
+--- a/deps/encoding-index-simpchinese-1.20141219.5/Cargo.toml
++++ b/deps/encoding-index-simpchinese-1.20141219.5/Cargo.toml
+@@ -15,4 +15,3 @@
+ [dependencies.encoding_index_tests]
+ # TODO consider using dev-dependencies instead (Cargo issue #860)
+ version = "0.1.4"
+-path = "../tests"
+--- a/deps/encoding-index-singlebyte-1.20141219.5/Cargo.toml
++++ b/deps/encoding-index-singlebyte-1.20141219.5/Cargo.toml
+@@ -15,4 +15,3 @@
+ [dependencies.encoding_index_tests]
+ # TODO consider using dev-dependencies instead (Cargo issue #860)
+ version = "0.1.4"
+-path = "../tests"
+--- a/deps/encoding-index-tradchinese-1.20141219.5/Cargo.toml
++++ b/deps/encoding-index-tradchinese-1.20141219.5/Cargo.toml
+@@ -15,4 +15,3 @@
+ [dependencies.encoding_index_tests]
+ # TODO consider using dev-dependencies instead (Cargo issue #860)
+ version = "0.1.4"
+-path = "../tests"
+--- a/deps/env_logger-0.3.1/Cargo.toml
++++ b/deps/env_logger-0.3.1/Cargo.toml
+@@ -13,7 +13,6 @@
+ [dependencies.log]
+ version = "0.3"
+-path = ".."
+ [dependencies]
+ regex = "0.1"
+--- a/deps/flate2-0.2.7/Cargo.toml
++++ b/deps/flate2-0.2.7/Cargo.toml
+@@ -17,7 +17,7 @@
+ [dependencies]
+ libc = "0.1"
+-miniz-sys = { path = "miniz-sys", version = "0.1" }
++miniz-sys = { version = "0.1" }
+ [dev-dependencies]
+ rand = "0.3"
+--- a/deps/git2-0.2.12/Cargo.toml
++++ b/deps/git2-0.2.12/Cargo.toml
+@@ -19,7 +19,7 @@
+ url = "0.2"
+ bitflags = "0.1"
+ libc = "0.1"
+-libgit2-sys = { path = "libgit2-sys", version = "0.2.3" }
++libgit2-sys = { version = "0.2.3" }
+ [dev-dependencies]
+ docopt = "0.6"
+--- a/deps/git2-curl-0.2.4/Cargo.toml
++++ b/deps/git2-curl-0.2.4/Cargo.toml
+@@ -19,7 +19,6 @@
+ log = "0.3"
+ [dependencies.git2]
+-path = ".."
+ version = "0.2"
+ [dev-dependencies]
+--- a/deps/kernel32-sys-0.1.2/Cargo.toml
++++ b/deps/kernel32-sys-0.1.2/Cargo.toml
+@@ -15,6 +15,6 @@
+ name = "kernel32"
+ [dependencies]
+-winapi = { version = "*", path = "../.." }
++winapi = { version = "*" }
+ [build-dependencies]
+-winapi-build = { version = "*", path = "../../build" }
++winapi-build = { version = "*" }
+--- a/deps/regex-0.1.38/Cargo.toml
++++ b/deps/regex-0.1.38/Cargo.toml
+@@ -32,7 +32,7 @@
+ [dependencies]
+ aho-corasick = "0.2"
+ memchr = "0.1"
+-regex-syntax = { path = "regex-syntax", version = "0.1" }
++regex-syntax = { version = "0.1" }
+ [dev-dependencies]
+ rand = "0.3"
+--- a/deps/winapi-0.1.23/Cargo.toml
++++ b/deps/winapi-0.1.23/Cargo.toml
+@@ -13,21 +13,21 @@
+ libc = "*"
+ [dev-dependencies]
+-advapi32-sys = { version = "*", path = "lib/advapi32-sys" }
+-crypt32-sys = { version = "*", path = "lib/crypt32-sys" }
+-d3d9-sys = { version = "*", path = "lib/d3d9-sys" }
+-dbghelp-sys = { version = "*", path = "lib/dbghelp-sys" }
+-dwmapi-sys = { version = "*", path = "lib/dwmapi-sys" }
+-gdi32-sys = { version = "*", path = "lib/gdi32-sys" }
+-kernel32-sys = { version = "*", path = "lib/kernel32-sys" }
+-ktmw32-sys = { version = "*", path = "lib/ktmw32-sys" }
+-mpr-sys = { version = "*", path = "lib/mpr-sys" }
+-ole32-sys = { version = "*", path = "lib/ole32-sys" }
+-opengl32-sys = { version = "*", path = "lib/opengl32-sys" }
+-psapi-sys = { version = "*", path = "lib/psapi-sys" }
+-secur32-sys = { version = "*", path = "lib/secur32-sys" }
+-shell32-sys = { version = "*", path = "lib/shell32-sys" }
+-user32-sys = { version = "*", path = "lib/user32-sys" }
+-uuid-sys = { version = "*", path = "lib/uuid-sys" }
+-winhttp-sys = { version = "*", path = "lib/winhttp-sys" }
+-winmm-sys = { version = "*", path = "lib/winmm-sys" }
++advapi32-sys = { version = "*" }
++crypt32-sys = { version = "*" }
++d3d9-sys = { version = "*" }
++dbghelp-sys = { version = "*" }
++dwmapi-sys = { version = "*" }
++gdi32-sys = { version = "*" }
++kernel32-sys = { version = "*" }
++ktmw32-sys = { version = "*" }
++mpr-sys = { version = "*" }
++ole32-sys = { version = "*" }
++opengl32-sys = { version = "*" }
++psapi-sys = { version = "*" }
++secur32-sys = { version = "*" }
++shell32-sys = { version = "*" }
++user32-sys = { version = "*" }
++uuid-sys = { version = "*" }
++winhttp-sys = { version = "*" }
++winmm-sys = { version = "*" }
diff --git a/debian/patches/series b/debian/patches/series
new file mode 100644 (file)
index 0000000..57dbd43
--- /dev/null
@@ -0,0 +1,3 @@
+add-paths-override.patch
+remove-deps-path.patch
+remove-cargo-devdeps.patch
diff --git a/debian/rules b/debian/rules
new file mode 100755 (executable)
index 0000000..b6e849b
--- /dev/null
@@ -0,0 +1,77 @@
+#!/usr/bin/make -f
+
+include /usr/share/dpkg/pkg-info.mk
+include /usr/share/dpkg/architecture.mk
+include /usr/share/dpkg/buildflags.mk
+RUSTFLAGS = -C link-args="$(LDFLAGS)"
+export CFLAGS CXXFLAGS CPPFLAGS LDFLAGS RUSTFLAGS
+
+rust_cpu = $(subst i586,i686,$(1))
+DEB_HOST_RUST_TYPE := $(call rust_cpu,$(DEB_HOST_GNU_CPU))-unknown-$(DEB_HOST_GNU_SYSTEM)
+DEB_TARGET_RUST_TYPE := $(call rust_cpu,$(DEB_TARGET_GNU_CPU))-unknown-$(DEB_TARGET_GNU_SYSTEM)
+
+# Cargo looks for config in and writes cache to $CARGO_HOME/
+export CARGO_HOME = $(CURDIR)/debian/tmp/cargo-home
+export GIT_AUTHOR_NAME="deb-build"
+export GIT_AUTHOR_EMAIL="<>"
+export GIT_COMMITTER_NAME="$(GIT_AUTHOR_NAME)"
+export GIT_COMMITTER_EMAIL="$(GIT_AUTHOR_EMAIL)"
+
+DEB_DESTDIR := $(CURDIR)/debian/tmp
+INDEXDIR := $(CURDIR)/index
+DEPSDIR := $(CURDIR)/deps
+
+%:
+       dh $@ --with bash-completion
+
+override_dh_auto_configure:
+       # crates index location must be an absolute URL
+       sed -i.bak 's|--TOPDIR--|$(CURDIR)|' .cargo/config
+
+override_dh_auto_build:
+       # Bootstrap cargo stage0
+       ./debian/bootstrap.py \
+               --no-clean \
+               --no-clone \
+               --crate-index $(INDEXDIR)/ \
+               --cargo-root $(CURDIR)/ \
+               --target-dir $(DEPSDIR)/ \
+               --host=$(DEB_HOST_RUST_TYPE) \
+               --target=$(DEB_TARGET_RUST_TYPE)
+       ln -s $(DEPSDIR)/cargo-* $(DEPSDIR)/cargo-stage0
+	# Workaround - see https://github.com/rust-lang/cargo/issues/1423
+       tar -czf quilt-pc.tar.gz .pc
+       $(RM) -r .pc
+	# Workaround - crates index must be a git repo
+       cd $(INDEXDIR) && git init && git add . && git commit -m "Dummy commit"
+       # Configure to build cargo using the just-built stage0
+       ./configure \
+               --prefix=/usr \
+               --disable-debug \
+               --enable-optimize \
+               --local-rust-root=/usr \
+               --local-cargo=$(CURDIR)/deps/cargo-stage0
+       # Build final cargo binary and docs
+       $(MAKE)
+       $(MAKE) doc
+	# Restore from workarounds
+       -tar -xaf quilt-pc.tar.gz && $(RM) -r quilt-pc.tar.gz
+       -$(RM) -r $(INDEXDIR)/.git
+
+override_dh_auto_clean:
+       -tar -xaf quilt-pc.tar.gz && $(RM) -r quilt-pc.tar.gz
+       -mv .cargo/config.bak .cargo/config
+       -$(RM) -r $(CURDIR)/deps/*.rlib \
+                       $(CURDIR)/deps/build_script* \
+                       $(CURDIR)/deps/cargo* \
+                       $(CURDIR)/deps/*.o \
+                       $(CURDIR)/target/
+       -$(RM) -r $(INDEXDIR)/.git
+       dh_auto_clean
+
+override_dh_auto_install:
+       # We pick stuff directly from target/
+
+override_dh_auto_test:
+	# We don't run tests at the moment
+
diff --git a/debian/source/format b/debian/source/format
new file mode 100644 (file)
index 0000000..163aaf8
--- /dev/null
@@ -0,0 +1 @@
+3.0 (quilt)
diff --git a/debian/watch b/debian/watch
new file mode 100644 (file)
index 0000000..1271921
--- /dev/null
@@ -0,0 +1,2 @@
+version=3
+https://github.com/rust-lang/cargo/releases /rust-lang/cargo/archive/(\d+\.\d+\.\d+)\.tar\.gz